[Binary artifact: tar archive containing `var/home/core/zuul-output/logs/kubelet.log.gz` (a gzip-compressed kubelet log, owner `core`). The compressed payload is not recoverable as text and has been omitted.]
/ܡrаai%:bwa' kO1^x?Q0T9w؆s}dޙC}0i;C4q+ȋ'Z^GӤ~p}n ii*dX\aAm.ONU6F iM ]9C%QMn\Kl#޿ ͯ=F&QD@W 5]n[9#O3@歁z4xuTV$GRy$ٲm:&PI\ZxN@' K,ʇR5zj)z) If#gO19XHo4ڄ2z29h5sh.ge]uns/=͓T=!8Y~u}7hw=ڔȨ1fiHY0JN1[ϣKhHSTLG2 .#HFjG,"XXmf ue,T ϊ{[|/{.H;5\yqn|FiiT J35\G $J-dB@"A ҿ9h 7iQ#CQo74\8,kWr5Y^}xx|A9,##iv\o>d䇋J^w/Wlg/?AqYO>vFN˻{Ї7w_رSS35),eJ k9>[VXzO^%]=dr!}O*+K 1u跫ǟƇa~v;wWngr3Mpb J7㞧E==[7/U폁ݿ&A0la]?VY'ޯC\N#uvWm +|zz0jnI07 (س'H$CLttQ7Nwʳѽ[r ه}:p\6P}S2Z%!ZS8+>E "mp&V؉$=A?UmBF*H϶[b:CY@^ Fh' cb1Jq+`d҂hZS6EԒS PF VjI2-d ̊ILhļsh,2Dz8QQE QJtLy+GT3t²wX.t xGn 53E,}|t~mccGbGgGE>s\Ҍx"TDQK*; Wzh$y};q! ºiԷʐ 5Ƅ̽kȼӿ~{ t~U}Ÿ_^ѻѳi6rh0߈2{9bC)^JAe+˹[ U1XU1PZI);\+opnʜ՛ߨ5ʈճ pʎ}ضlC`ZڈӻJL+NY/}q:ƇDo !l"THLX9U$ TB1:S×إkU?- bIeթts!e.iS6"9   ^hv/l)ll";RrXLQ٩`fS+m^SidOGVjVWPeUeo Qj(g TL cpRTKK~6gOgKQhPR2SM\XNV{eVVg+[}asj˾T/'rx3]Oqg'L>ItIZ%H2D+&H>K U=8X&7!XƔ$mq6|6ӚspT9PWكs܃ww "g"Q ֿz/֠ޓݍ&^ҐeJ`,# Ag*ϩpD/cA}!*Tk!-B6$%+?x`!Gh02z2vfZwt=:%wVU7+' n똻d<v'+Ē=;uÄ'ٯn q7hwEyS{\wd"=؎Ҧ**ͤ *m%TQ^i8,&eR)yEUQcS#.&f8BO &cGА NBd\F9YEx)X -*^κ>6if'BhvxyX~ w|EGfC."MBP:( 6r9XdX*iЦvvX-s#N↸ jW/EmSMC`k@UJdIA$0 'nBN/8;z7ijQ=lTc}FҀB(Ifx.J|0!lR9 p9i Ёv/C r1J>?Ϛ$iF@6Dpխu PaT{z[rڍYj=2j>}zgz_ Nh~`Ms= N}X qI$_%d]ud<.UůŪOuN' _F2[] ƟZt:YoFo٬:-Al9utv>Λzn\)W߮A8.$}Ըfd`)S &< . 
ZaFYrb#,&>ξH!=9LIo jKqj<1bw>t@ "ΈRRD+Ss)ǽLi^WIso>?/_Af}*.ikgw$'8U6(WK??~7 k py\Ax /Ӗ`<'\t!oUm*h8k6p< 7DKI$u C.b+LcAD%8h]#9O"N _H L#P;:φ 6S ~'?E7Ʒ]ȰH%,.q6 a.n?<`I 6S!%3V@*›GxB4 M6Ed6:s(v.|[#7VIǸÁj> ALEo⷟"L˟O%]KTI@-Ҙ4ed/~~E~ӫQZt-8RGe_6L+xSha[ UDkb*ҾM_>Yժe>Zڻ)Ullu_z=?yp&zG2Uhn2ƍnw\8)I]ͬ0\ `O_>#DksdG_^sqO^~LEŵvgɲ v)vkeM LǤzFQ2F6 }!R6MW{y:擖LםwmB!ww֡ $\<]_<4:)>-R̠(G,L'B))d[lO4DcrϽWH8Jc2QC a <2Nl+H[ VJ*}Hʎ#oVK9[\\@Ȭ"D*CCpD`<\0H⚡}BAu/4`۳T;{:Q-kHmn)~#T(vUnWW%FCTk+b$ B*%/ApBp h}hTۏ* S>B\WbP0g4yzLo~,8K^_7ǻ䶷C폋:~Sߘq &v󎟪On?^O0) LK)rsr8}^dQ@3WߤU~Gfwȝ4Nݭ1D+؜j5E)Os;-rw%Ys.B,XE(,"Oxy~cZ]wNJ7-57l1 fNd6K24zEI[)ixtۥ ^l]u"m.ٶqF(ҺfbƑ fhqoț7]G:eC#b̦IjzOPNi;u۸nR54ן;vl Zu"":)k[.Eˉ7$ocfo7'P( O2"̬7kYDe JbO+̭prg4˝ā6DkONJ}6ضJ=\~;as.z]BjOBtj8p~Mkԙ.pC" f==pQ|x?xTNfpFA!*F.f7j%/]D>i܁\TqH"e-;aIΪMy-VoHA?HP>jAu w$i:m=ٴ|5 =֣vi<{כ޻՝i0}t")h0զ41`T{)B=VY^hFj14_s2YZIpU<)񚔀RȀR)yA.WJ  C(2AbA˥x`! e4iG8/jnzd&R"dkTPy4 w5J9#R$wECy4lt\PH/$ئô0+.YE? 6ȑ!_9glμrs,ܑ9*GG7w>cq^`ZF `? r{ԍ\d-c$p00!L[w00fW"h2E)) `$J<Xd3/r8k|80##a[~zGz,<gίy0:b,Kqp K a`U$i( oO)r^nb Di1CKks R2Ѭ/\pȽ3K'B._U*nXu"oO^s|S 6=u[rp논\Yp>~x=U=8t#rӣZKŵot*Ko[ :V_e_vw%9\)]Kvx."&"H]dxo9ftaX(n]^Rm/V,8Y6NOlƭa-}"mQsr-\-_!\l=7Ύӷ(˃fos+@ؓ#: uBhU O8+ `t"v-d7 Uhotj3Jolқ$r9&Z੧7IT7Q^]B"8GWW#X#g+tUR"kz=*'U"Jrv.*QPUcuz''ފbei<z_o囆9wCQ1Qt)񧚳y7N +uz8VJDZDj0Vc%8D*DybZ 8`8 % aO)DVagr ꥥ 8' (WP>h2}V$)ȿ&QuYR`u8i+*dת#pBBuA SXE^{b9Jr0K qwOK;MSa{.^JZAT0 dة>eonڼKLV4N)BX*"a)BX"Y$vVL tt>#'U"MUVSwRE* IT*JbP:f=jlu z4 No"ZKpBUT-UDKRET-UDKRET-UDKREMtoۮ vR7X3;8qflZi6h5C87s@{Z^|=gDD4xqLoڔô~ЛviwSq+-ףxkv08h/ihk t[vσpCqVqk6x[ZVl -,ّDd$' O,XOtAED!X=#I[ ՞"npcђ?d "*Es;P(2$^sǵb9)*%d KȀKxD}'Қ1HivR9|lwąk^BuESQ &ŀ:1"sƝMr'P`4i4Jb`3mmYg'?!+ FQeP@G&!J5+~08ۃϸs_ݍn";}[fh>r:'6͞wO o: B"* pB%ÚqjSrQV$(D*V!4:WJ# 2ٶZjG&He 1(TBYY`DaWׅ,)*r]Q,#Y*F&eZDž|#5/m|CD2 Ągzȕ_1|zKE0؇هs`yqb9g}-ٺ١,3nf,V y2ZȺzg6Hu$HE/S.bX84/E\+% IOTuXKyA#rMgˁowFږ.ZvuƟo PdfB*iLdBWfod <]ٗ?{21^Ndvjw*cr<ޏ)0 )E\$.2b2GwV:Ndoj%sɧÅ|gj+CJ ,z*0V{%{t<(R/A2jM `\RdFg˧<$7H2IckGdik[EOT\U7);?ܰQ1- NWNa\?FW;>9ȉ J2ZF1yB;I[0k1鵉KL32 
C.L"#!E1Rc5D6b&nj,!,C,}:8-%ry*Vx6iV!*+.x8y%Ɓn ĂI ӑ ǞM\5gg$tتYDmeD(>H:iN@Ir!mؾi4i!55JⱾ/7Q;9Ylt{ĬV9VsJjQI-&:@gl2:PDgYL"p8zo435յθyBsdq^2lUj/'Xo N> :쎮/P[Eu[WKf A!vuu7o{ΛNl+}4eG- (wmxxwɊ=z^j}>\[ݞ{mxMW+¬_4ݝe[UEA6oIo-Q({7?7 X/ltnnX=-"P\<Wu՝g?T\K݁M6.T2l2Rf܏ 9P;+F))o;)3rIEZO\IIئK %YR[m .困HhJ%\[k:et`, c[OK;U#.'y=X4o->ŋŴd ~+gx}uʾ]`0-m_Hxnyz[,ٲ7MoNRo5jMA*}|6- H4,m8\TZN*Ygsize蓮k4)fS)f4t^}w~rn?bN*g ,Ґ{Xz{Q=X| ϋ{=/xܩ u:i':PF5m;SJ 'F<|WCHK^Iy2AX#"Y,c;^{!OS)<_mn\xRnTM'otcu;tA|ʢrx.oh!tuG2ͺъP!^z h3$՝u1wwNo}%uaى{1>9qdz AЧ୍(cVxI8o|+'RȆijԅoh49Kv%sgې֑PQPc.G4H%WT2$M.m]Aꤤ&4HtA'.O*p:&訜zHAٔLNG "i,0uKKR Ke܍ӛ$& ]p;4:hFF;2jkpM$\J~6jV)EYEfj/$DD QeXzV0yI) Ie#SLNx$^E桯[Ee)xMȅInR6nOkycD&=_oJXg(993 Ihv3AMSv&<.ūZ;L%r傰 sX"M9/[$DeM7aR`)yeU-6c39"⥺&l. & \&#U2Vge܎Rb ͌Bl uDClvW7AYy6w;!ftn4]}7>9ȉ J2ZF1yB;I[0k1鵉Ku/I3}{ 4 H0lJAQ؈36 t)Z~|XjVTڦY`xͽ DCA!fQ&A QEGU^FyUaV]Fq LȐ -0Ap$E=$RՎ&n{ؓEbO"V"6x23&۬ˈ r1>` 9=\A! |6ziZӴ״ְ(Ŏdڹbq[uocw@# L= 58YM0"\ѿj/J>AW}i( ,cLwg뼛r|ǿVEp̏IKiFzK$ʯ\z{qB~>ï`BuBCW{"sM"[C(m|o%M {|_/_[/:7X݋&7ӹ~?q1?Fu7q.JR jpu(x?rA~|7yv2Jvyr1@^^"$r{ϣoX!U; ܺ4O4Xv: !זDP3,9gZYL"p8zo4#4_V:@;X|yQT{~V}eAo˥> 쎮/*ڼgusmr-ʫݼuz{8ʺn ﳞ(0yPHloqww1a1YQϡiSu]SsS}r{?~ .~+f(x[GCK?ɃSS#4z$@ZU.w[eʙ)|s,1╠ f?w ZU@[9:AM@ j婚=9X 5&Wb裿(]/ xzrvx4*{=k}t| n|vd~s=e&tyg{Sɫ:V[M/i 1J2,nplZp@U\%i@N2x ))Ȫw>nޙ,oqXze9MU GN,Q}.妆_6 ׋^G/ilxY7 ݠWXM) CM'eǴaJ?{p;ŕE.6l>_J޶+: zz QnVOVne Yq_!vn3><1«=й-ˣ+2N`k"P7p"u Y5Ϩy8ϯRƳiU8pɴhqbn3t .)@#z*(, o u:gkS)-rlm~;s`7$:k}cY f,}ޗc[lJ ߠޱTP+{\BkQ _(G!~^ >Px6 [[dC>=JҔAڕD0A#u:1?\oO>,NU|R!iﮋ0]Uyq?s WrAZ7/1EO_7M?e0 $E1P A-><ԻgX?KuЦ~~jAro^{YD{4=sGB= |~*毨jףϥ۞4t:eeEzi擋~7t8$5 9DDd 8 k*>E"mp&֘-ɉOϠgT٥muG -ڞ ̘<j".B-QEJi; @S\qɘ""T\Cs1=Xg*u*$ޣ"%\Cs%'P`ɘ"b\)m+S "1bU`*ҚӢ5s.jW+.>P^eRmvO_HDLdcthO˚H bUPR)Էi% KQEAsT^hAjWadL< g!XzqL A,&L?wdJ!. 
/gscLf2Rs5hc~z?f5>&i.iO- 6 %ACRZklh--56ƆZcCklh 56Ɔ_T]vmx2G~si*"$X$,6@ Q[BԞgLhgAd.4L+89{S yQZ/wZ츍̘`!$;; c(t)*+jk}ay=8liOz{70 Mӛ^6R_胾)Ȫw6-# Wg2"rg~nsvD).pj[M/,4sf6ly<\nЫGVVDžOc0jJw "Z6__ d%o[==~@eyҬ\xtcfu~I1!d6|'o:wB;+n?;wsyt>G_:[3{Tb!ף-{z?T8S%+e<&f<=Eqǖ qw)^M>߿1~ٛnnI3=#}= ]m\^6VjcuK?V;=!rI(kT%QZFU%\]Kj ת  pru4Ā2 \uYCm#Ldxd#<:V2dmh_ۉzMC6WsKPG⒒IA@4[TEt%m1N%FUFZk}(7EKAz]P^qw1|ĩSbpW@kzWW41M#/.yj^sx3:osΧ3eˏ}87oI/~6 6Y>'KWqB@;*ؤEGLU7R>=J^ Eb0D&'([{ [~UuY~|gz1 cs6$@H26Lw)/IIkPecm죝lŽq~L"ؼS8Ыc~$e)uWĐM4Gen$ 98ˁYs۔vLh.q]v:K(v1Yo[|s4簷nqeY@.'B#Q3`#)#^I8 p4pJz! 2Xe(0C#:JY11iibļs^¸9kq-xk&"T$&1R5qQb,-'oH/GeV,un\]=48ůpa-⢍α##I>c\҈*@`"; Wzو@ =4$D@ )l $ x ĄRi(EHvY9b%7ISdr.E|16KJJ5Ƅ̽õTF VP y$xK[o}PPPz'ؠϑQLЧk"=7s]9gy$$CL@ou={?[vJϥьoJ{:yeumyմ!7?NhB*tN.Q.J-/S+༦RS)gbyޒt {]Z]h[sPrw㵻# 7geohB}m!'YE"cRhJ'÷N>} z+zQֆiwZu!\4zW1hSgې֑;֜ ?"w)lx? k! Ц)WEH"H KZD|Xx)t +ǻݧOHZͳ$h&2cpRT?l3JKUI(g) 2SM\XNZ-ǭ-Ն[ ?tdqj$D |׃}k雯{BˉV!yr5>.{vMǬa)Kٻ6nlW-~E66ŶEbl`Z\IN.3,z4d Ա5y9|xoq8W\hɬT£ZF3;E%G)qrS!7.I`oLN;xb(UXj#Vgnk5ELh)^q*ybzϵ9Gi}ve˼sQ.-@ށW۰=𱔸(!hpm"p wL,RpJR^k\ߢaʖ_~n.^wpY:\:Һ.l+]Y ]=0KikRȤu)mpۭR2òGՈ JoB8exiWA;[ݣ_etz~#q!Tɺ3uG<{ߺZodw1hcT21Mo3THY"!H@mrp&TcAgndsW|#yQp"@*HrpdgIN %_\p}PF #?U*gF&G)h0 iܐN#t$FgQdTR9sp*C]{-!$X1bPTጤR؄ 426&jd\n ӌMBbbTܷu^~X3X^f^3.Pnrvъd)K *# $eA[wL;3f/QhC6 YM+Lth0hac"tӗ(g5b~/̐}AƴcS nQ`Jh|Hf^$CA!u̞; K;l6lv wZR -C*E{&DL{#b~`]&iY/ؘ~"A bRD;gA Mƭq {Bdknu{cED)! 
r(&)<`) kUT4C9/FY#DZiרyr}7y< m Lq+*[,z}7|՜tRtude'9o<'q]vB+rpsmA|2\D|ϲ&-SwrU*t)d!.$2%J쒴A|_N"tMOmW3P1  +6BzRӟ머3Z}/{| Ȉ9>|z#~BiѯO:Vɴ)Tr˫^x|Gfa][,AKQV^(+QpY'/ I:J-BMbW?.|#N߼{uw3\F^wp-Q96"Z߶A?y647ikNӲ^|v./hE6x2TцX/6=>cc)GUl6[6~J J3NN'T`lwI82(x #hz0;SwSI4Ĭ`*2)Dؔ7Dc>8uTƉ-{ݑ TkvH&7CE -.[/G Oń $#h9V0F~,~?}Z-E]_K&/jwF׃ްߗ@>0/rާQjcqll*ՠ7} ߽WO|0:FyWp( ejp8i \=NJ #p髧L"; BL2},WYZmPJ o2VAGnhQbh{LI:9Ɓq%Կߋ b߇n$AZ:!ҔBm1CcϘz/lq8*#:~grPigP|rUuWd7ƌXE1\]_+\1dOt7k7k%Nc|b*]u28b /Vjʅit5]MR&GMB)teN@Sr 0vu>-Y!o;%$Rr])Q!cSL6afxk37|?TRxbJ'EAԛ/(aQlXPlWq(;K+ľS,%;"ƕԎ ̄:J}(ppԺo"Uȃ+U},-\}p%  \eqg֞U` %-\};p\)DWf`ٲj-\}pQ gןNu(P2k5P!N^1"Q1s(629c>)$L Xoꢠ8ƗN6C`y!| v:\/g̲dAi#Ę 5/;C.> `w*=aWãqӣԛ }V7+BsPݻo(@6m[G.^焮14'6szĮ]gwbmweMh].8MnQ FGϥVRȝ "D)Q)ZB2M\x N˸küӨ㌉PYV.~6ca m;Eq8wߍ]wG[p 8>/lB{!6O){o&/^|/l/3?/nO'rK9iiן /&)xuBitRQF›IQgϰuϚOX"it6^G &D*ǭ@Xc!d(|b"*4b8 )iMQphjsDAg iL!d}`MO {>5{Mv|/T _>w!K6S~T'm.>7BG%c9$5ZJL;yckh{Sj?[dh!>ܱbg<7;7fsm?>{x`8;[ 89Rtۋ%m域? 04/$ѺK!!Ƃ.%xLehOqCNl$?b,tW4O^kDEA*!I\N8D`@"ӈ_66f2P\# >&  ``*QfE:ׁ(OE4l6&Κq}ת/7_|u):N&7ob5;OI߼jd}z>\!9!QB$&>)nbrQeD bpM-$;aV>Mfd5ej\ ..|W٣ӗ-EQN!l_Ƀ2Gnok껯ni{P=gfegSuuvo'Z^'H~uw#"k59_VQ>ȲwZ5j:;pܬ8hhe s2-c}3M-IDpj:8ʑbk֎.2akh-KV$Ҩl8`wvܺcAGBGZ7>`\°b>9-*Aƒ E lG|$/KC"b!":wAr<]#H~k !%!oJr 8M(rn+֑aYqGCܪ\$JC)llD^ >o2G@(h$Xa”5&O{ r EG[IK|TPWDi{°r_/asLi1'øT/Slݔ][y2yUyM_eZʄ2a^!umLjnDh:vFzړ:'.pW(!m3Uu7pU.ǺL*JѢ*7aD3J-d.([| h#l#l63W_w 0h</}ެ>}qr*$ q+AOVrHFN-hDG-{fb*"Pa4qFI)%4xc K!.4DKksiY录&XQr:qb- 䎍{SZ,r2HmؤP筗ZgA*7A8RApe YS#<ƌ!me`1VYDc,6ʌJ86`#*$;A Utdll8伆8$}d׺*1E[WD؁aBb*1gWuݻ"Ga ڡ*bmq ENkcKX"#Qd71 F!H$B8֞)Z%Y<ɻ* x$ )[XL"^ˈiDhnDdƝ2rl_ k<t۰`Cm)߼{B7օ[4{B,^b7Lf747=m6+t#kEbnLBk Ā?.&G~JO&wg$ߎ/u4Ѣof nq@ô;C ۛ^Lf{CY̳i|CM- sO#pjNz<:)6cU_v|VbPSJQդnṑ2KE%QWX@Tbdl2_+c gZ֮zENGǣ=ymvJ1X\\ZtP>bKFGx5ŀu !DŽ Oc*5(K)µi|Y5,s!8[fz~4Y}&$IU0C CVwKZZ40Κf9ۀA{ cX:6P)#/`zS1(0`sI9]*׻%Z%:]o<"#R[FǤ"Ufi%1`KAyG!SV(P܋U]cFJ[EZqrr%i]*jvyd#REAtwE_jjƐ{b%1(`  (("\ )Q8M-1KE!e6]H&[Eh{| 5q!LPT] Ꮍ$B$ JL0  !aOslr`ָtS2эq/ߥ̨\WSx?>OHJ>ގ/>̓$[ smu׽-c;(U|; C>]lp{!֓$=1nH{7 A`(|Ozg;^pZJ^k۳*:J 
aj#cХoGb'e3LV:V6JѬwR[oȥ6裏9%+tT;~`x='u\cH%<M\ן?_>_?9L#0 8D& !w-qiTm9m9߿- k_b7 Ж!= LBc1"x9ve̖T5PCuNW`y؟#2 'H !mA:ք:H<8aԧî1G̦#x1 5R0!($ENjRSi#j>&qbgN Q* &^rggG -єrdGQf_,~qY6+a+,#GN4!3AI q!Xm`}cٴHaޓ\XFBX?1uUq@D*& oa-ѷ0F[h o!(ѷa-ѷ0F[} oa-ѷ0F[} o oa-ѷ0FRoފ [ oa-ѷ0F8 U yNJC@= xZ??T[>;|ٌF.|s '0W{0$*axdNTaL]Ԓ DA LQ\gmgt U~lnGLi`k,u;bMٻFr$W:x0P,;-4ffb0re%`J%uMɒM\6L* #^ ݯpajt~Q~L:?^}Eͮ&Dtj<1T>? 2]ޙ{f@qףr>]x^R5F@y# o䍀7F@y#i䍀7F@y# oj7y# oYF@y.6F@y# oтq8צBܣv% ]L'(J!=θ5YzUcdy\:cŲWA:g R"A.7[r%Lv4JgwL|W0R}N ]OW\+x.R]w]z7$\0ރ:jQKnz}4W ޵@~׏f!lon 7=74wkZpMy9UǂzN;qtR=xCl0wMϚnpԦ[lOZYqq<K]ܚoL|gd5d4#Fʕ[3U P؝ }VW9ArA`yB]Mͅl5+'T}cȸsIQhS*nLG$=y1Nd- ZIDkKm 6OdY9^MYƀפ{v軾LS@ 1npLm?gAPOl ƤAOO~^ d⾣uvӜ C%T! GOV$B[_O>LB 6!rDb Lbb12^:1MUO"!C`RfG$!))Z92xZ)& Q$8O"&T=)V]}f[n ?]`wyiq$E7nu"ݕ1d )E!1;iYX`!jBqJ`My(%?}XO:d ьA%hbBͿY߈7HsSw2GNJ|ZO >NXr(CbXPY,Z ځyhI+h,wU;5-z7Eȕd&N^׽g@UY ZYzu; U]{;MnMpݗ_.laa[b`G(w"O  ʂ5:3u8 2 i3qU|@)3zI, >Rg=OGN;pA/wv'WP;p`)jkg3|AùY;wy}T,);Ȟ|ـ"̃W *uҿkVTRK?6SVƲlV ]KejF2zf2e$ +I&J-J{Ȋe %E)xNvq/O/`'z~ͬz}TBS.I=HQ»&wgf){!8ń2I ),N,Bi|H4Z k<QIiB~` "1gFH)<]vn]ܺϧj|&y Oc\2ó݊ bZ#kfW}Vmˤ'Oi]ϕeA]jte09I~(]F?'Ma8 OEibF 'VIw'?n3 9}pi#\CRËɴoGI*?=MZ`vc4)OGL7dI|ˀn2ѽK8wYogNN~f}}PŸYA\o C:mᴺd5aaDkۆGz1t~:s_a5:n{nnGW=/σNȕ'd2{jTmݛNMvy4?WbgU z-ye wxupy>TGF9;a1yE)86>A`20rRrْkaWGL:d`Єe.fXp%RT )+q>)7 f:C*$m{C?hdVgli >o[Hk|!w']Q-QK{xl3yjZ&CeMH]*E:={vDñ/rgb.5qD%& ZƌCqPZy@ST <~ؒt># GM0P!@\gra3v{AP6iBMHT(e|Xy4(5J0f$z5;C*X1ZSܲ?"\jVM;'݂FԈ)Ј S,wi=ZA_YBl\.ߨ9 Y0 IL %-'xG _|txYRL7n9;w{aZ`~W{,Ha,OIFH%X%hZ&G ^s%ɋTIHU U;'ȹFt% Y' 1B#ʁ\M zNh8+gIꝅvswvEC|C ʹt}(ix 3뤔2B3Ah5E?: ya"@z!"5=1X,d[E߲¥y[m_~Y߄L.co}\? 
+KgZbiW w V]^Wɻ<.E< t(&B"e%BrD*+1[: ţʠ%Y`@T̂j+R;oc&(7y;_w,4㝦'kBy.{5INp(!^9g2傋$lE8E(|nI00|+=ìРJ0 iOKQ#Q.pGM7<.2 9xwz7A<stU]M7aDx7xn^BG<(8,r,RI: -fˋnJ9 Ա<]FYfyc>3PidBpQ œQH)Ip!8*Is*+YZ("Wt;#_'^k2V xAXǀaJ*@1s"{ @E+%eoմ~gS ʺ -6p(`hנ fv}{ذW ^aċ'KoO'ݙqaLɶCzfm%[.d ³6v؟N<#񆉙0;r٧hˈs9kgl8j=\nV֣၇~q~v·mGg&m9n23i(`>op7lj٩ HD o(8cp6rmFe2A&*c2BYjʤ]N'6OF;׬W0<]R/ <1&BB9!#F:pkx)H,WA,bg"!_*YM6Uuk1Uĩ".<{[2܇tD5x/jV?hAgm ^B;HRD))#^Iw^]YӖ5 2`( y.hAGq @T"ɊIKS%06Xjee^ ڨ"Q⪘WFwՆB[Hn,4<[}Js\dz%vD_r6cNT3-3!nD$ Vn QOЌ?&R/'A"{,Hd Q"PmJ.I1!zTJfVR3G8&)x%\q+Lf_L͒,yՄ̽kȵ7 g7$zh Awx8 d"~ωW//?~4$r PeOH 1<[];s?nrSVw@z|Y`^NW`9*WC/k|ݼj>;:vՐhB*tN.Q.J-/ތS+༦RS)gb 9^oI8C ̮l9YP0Í v a`x(?"!u74ϡL DVFOȌH Kbp43(<upޢ\fskao?.o3RwI:QZ15Y)]"q5+-%٤ YP3~ܻwx?>|{F9D"rfxMBa!sQ39f 8e{3MrZ~ns״N#gW enx4| ƕ~ Z򲞽G`K rhh$P_dVQuq4I"CDH"H K1M&'Ǵ**kp8h4g_{S4r 剘¡xf7sTd%YWuN͸Dq9v̱cg36&DtTS)H)y`2Q E|\87)RΦd༷whuf"G2B,x=Wqg[pd1?ߌ`S܍ ;j{Lktg:YM2:r ڝ L.p''ݟr/ɬuL%8_9n/zo\ż祖塐ͯ%Kil;q]w\cW׋MwmK 6O) qՖ[O%!Lo3_ 0H^+mݘ\x ,M޼zMqq*J(M~Ҁ9;ߎPFx:' W=O4򰣝`C8c,RbVT mݥ8QH'76lgTLFX-h#]'~]Fƣv6qvMbEy eGw˽jPzUBy> ѐK%Si ZخIK0-x|.8*h Ԇ&׷al`kmH@$f~?KA\B?e)TlbR!)j(Qv CfXF$BI$ >V{ >&e(gJ[EZqrr%i=jvyd#RE2j಴?Ҵ" JbDQ5A3@P"q$8)maHNS3q8xFlMU@`85iatHW0!gJs( &)dzqh|)Sp!\2$,>6pr~q#M }PQX: s&!F%a P&cjJu+V_'/.'^΃%I<̹|6R}=s"O0{3Mn? 3Z%Ɨt ioFh^Xށ"0V6 f0bZɇtmMV NV ZtmU!qw SI sG.U}9,m.J)UZ378U?]~~1?]J3|5M(Zmᥝk[2Gfӯ0Gy1?umz%V UVlP?.{Po~ɫ?~~7>y ,#'/y X 2 .uA{wu@ޣ?m޴]5 Mm4r̈́oҮr-}]%ݝʄ(pFy+}hح1[S0ƂdJ1Pmݞ𓶍o#2ĽA &8L./)֝#]č0,#پ\v9շ[Q{~ tz ta` >PD6E2;8Y[FH(򥖜`zZMAY`~CQ~[  RQ0of0^MY5N07FI[:e)-д F#2:I287 N$cݛhUfu`ϨdUcy%5[QTL5IF# N'(>ӹ}y!0~G򆵰Qpx|"$рRFv^0݆CRt Ѐ]QQ?> ̂Hr ˽ ?:Tκ^Qi \">;B7QCDQ)TNMtG k`R=!B(R)2EPxtQ(֢tA(#LOp).}Ԗx#4sf"-cgYo*$c[[(;"i˄w~!j_(M?R~b##k&g*] d# jF,ʹ۔RFӐ=&$M* %^Fґ0 KD"rRuRv8s.XΤc[:*[`Wt yFE2j,Bʓ@k~Z(YNaBg 8l{Jq!|M>UTRD4+a<ס3%(‹WlV:}9,& K"'e8~ ս: yA?I/G~7?ܯJﶷӈ}zp@/ DgF z3]!|׀/|i~`*ϖihS&k/[.ԓ<女"xv]4]x[T}y;)ʏ'?W' L_iQHLC6G@fwrfx !-Id%(,XHp\r޲g&` %3*T$H"O)'\ɜBC,hީ ^FcڈSl!RSp#&ZH0cpH[;f{YJXj>_퓯,8t fF}Ηr.Ӭy3<( rILqb`. 
̩rc6r:`1VYDc,6ʌJ86`#*LۂPPTug/Y_@+-gOVrkor&i+r @г=2gdeȜ32#l9#sȜ32Rp EȜc9#sȜ329#sȜ329#sȜ9#sȜ329#sȜ32j9#sȜ32g329#sȜbȜ329#sȜ32O##~<s*#֦d( NB1sT`OQ:7 nܓ(kψai:&45ʐ|̱c_e_nݝ凅U 6욪Uxם\ٵ}Ի$ T5A7 5PS.zxM2]=UPծ[5_^_yëAs%WCbhot5K7PϦ57Gv.[u{hN?-xv}W eʛeҭ|dW[7JfT.|ײ&¼nL<qKl,Q *12+ΏGXevB=!yvK?h.`f[-B 8#9Or)qF{k<.xNuPc1^tb@: CcBL1jveZ²SC_5]ou֝ C 39ߛ(̾nEv8%zrJXZ+Ơ< Y ȟm5KX|gUnm =&SaX:6P)#VJ0G)Y9$_ hfF`6ۍTxG)IE$s2Ja1%uZI%hDʠ;<V$,jWkWg./UHOQ:6s.!~ޜbrIΞ@CW7=efyh\(G׺maފ2 ̿29O>4wӁovmw U" q8jnoo{}߿?m¿̢ϩ;H|c~ތ_to~\kEKuͻZu;JkNBB4bs۸of7i#=Tۘ-éjHCC2yѣPr^W?k+ F\ayX?G/mɨV11뼲G$vQ6NJ) %qEotd Zܧydޞg &UZ^(8KVDq% R!1"%%TP8ؗbkm]Պtm`**ma~NEkPeIBY<1hHy[!x gADRb3ljqLE&sZRDa4R4s&G.2^} Aȱ9&+("[X 8qtއ ,&i*->^5D.QXy2,VAя'!@Ķz/Կ^yk/N/?'7"0uߍQ~.'q3r p~MDR;w>߈Jo"P.j$HYepcBYrV7YU^ /Nx!o#*@m* mmfb] } ׫RiQ @bLFW􊧌OfW=@KAO[e帵2Ik"'u3ߟ, FV)^юW&4juJJ|.;l k$>$PD2BJ }꥾a=ZXoB] }oE-:Y/PYF_>oi:x)1'Ic{K!pFxh̠Z! Z ҫRj;oޮ.tZo^.@hA.G[i{wLt$#W2l]G:gըmeڒ8 Z066a=m= $28,07NY0G2%W//ۙoixrxs]q?;+7,xkD% @( AAQ4D'i1Y zX'N콦yz5 xA ~'QD(QOr V1FW3YEäW!+"J/sO'^% ),K],Ņ{q-C)G~Ӈ>L)/G^ΆM*:o0Ɵhb]l&f6/x/QړNOxv'iC5\Zoy\ V۲0gF# #s/me.CSGr,`z6ڈy`smk,xI AsUbUI .L"dV V:xsj$T'Ll$^7utBDrޓcuxo g j4rѬåUTBS9 jZZ-NE*7UycX,wIhV` ƺM3>csԅ<PTٛT jueIb˧}+;d'-Ϣlh2eD-CrL1oycL{RgL9¾>5_f22!w |`%4ذJz7{@PO ֠ލE9\y+b#uV [Gx %w{I#+_Jj]nI AC$x/@$$lq]XcHԉ T.dPY3\7Sv]j&Ou6(<">IV~dbO&B80fz-#-z{ Mm`tL,7]"g ~s.;mo<}cYr60v+W6&H2 JJW!^$)mgZY]=g/eAH$V.'&s0i9QH&^6fV3}"H ܬuQ#jy\!9!P M(&Dؔ71 T .ǔ)M/=<(vQ~0IyۮenXm< bя'!@Ķ5#/Կ4?;N_&!^1~O>oF[''E}3r5wQɖa):4fm1Ix1āaa7YGWIxB2FxQ3hkVIhݶ;l&y1-/>xCKg5W#Oo_vURWBRs6@O״\ԶrhQqsޗa} `(nqxM(R[b"jGͭ$p!*EH\kJ=v!&jib;c$:'-hȬӮZ5 RHO4f2a@n#^'\, vmNiw!l ss썹1El /f}=y G$ 9[7bZ ,qU$o'=q`zH $2Dfא y=h>*G4AHz KlH eX6a2΅HeUlP IQ9)O拑uHt[ $DCsmY#LPM>}81Z5 ,G{qrU ( n9l7x[_}^.{k^bEA&Ong_yxu3cA~O7V >F;AZR0IC2,)&ep[JjMy9hNAs0{wM2a2;l)Ì&<WL.}48Z'S wy8@^:qu ^wu9x,*ST]e*3TWI]! 
ͣQWH+ֈWWٯn"QWh?]ko9+B ,Eȇl2;3 b2|Bd#x/ò-ɲݶqp["Yj\䦷H\+֢__}pSG?z$c/Zf4e8OGM+århw5E"]MHWS)ҮtY稸jt5E"]MHWS)rG9Eɳc+)mi"w H] RW+ԕ@J u%H] RW+.5>]w4 q"WDmfJ:+b1);oQ{NO<^SVۿj7ׯ6'^꾧i_cM'ʱ(b[#BYfG?;Q\vҭNe<* dE)rʠ=r¢w\Qd{{OpI3qϹ]w3}u#t=r!ʝOOfg@U @f OI+ NiH-H[ ȋTd\JgQ w'sm.t8}> LN# GFs [pcmky7DGiA^]k}u㶰>EQn+:rO%m>}aVGk5;X?G,+-".+V3%2,|!{Y ,H?"v;iF}Zؼ#sڰ ؋2`M&Zp6y"]"!"F ydE(}fPF(zK A>D8 =qVT1 i'$eQ6LgGxÛȔ;$}]G^?Myx~k͗Kw7Ҹ\kѝlr5EVS}D6qrIH,ĠSi9)ZD8戒Wt>|R]EJ]mX[(e 1뼶řrAEf3f:֍|9 .c%5M[^[njL{)?JN Ąe$/#pg|0G Anrcb uDpNy R* Ì7ܦcp[.e"-f9r+nnx1qv t\@=˗UJҖjYI߸^0puǃ(F@C2Dlv/IFK, ېT6@W/&'Iz2$ɵW#YR&7%݆>낖EqC7.B.e1=yuVG/NǗZۆo{QViз669vGKM5RDQT;n+hÒÓ%`Yh2T&wmcYژ L0 c^DH,&JL 426&Wi ӌb!6'¥DoŒ_ivN`swry A2''s"A@h6JrAh0if8^萜P,ӇfVȒ8{.M"Q`4-ɷcf!-1,C,i:muS܍~4+.桠vcP6 Pc[ֹwHeQ3lDPrDD!(Aȼ !(fo!3䒼 kbI8rt9k_ꄮ,s7N}9h6TP1CV bxPK)x8-ױ5A$"<F^QJ򍓒m!>tҜ6TK$ \)L%3.Wts]ݯ vhu@Y^Dl3t|2ZֲR),0+0d"$iQ5j=m)D]JIɒ2f@F~BӁJ!$J)X#Dh~y.u~UG$mM,r}CeRWb\ rRP$o@zRFV+\#ā EsR.C{Ync[Bߜwv:l;,dAYth"x5fglc 2NeŖf,&!Z:d=MBv֘8;eqYl(*]>1.*5$g ڥl0ƃC482I|&緢s&D("O^.F RN:3]GZa:O`SR0iakm8S|Y/ ,'18y9H`U%e]9?áDj9E%R+3==U_uE$DPLh iMՊ+UT 4iAΔbбW(*OB`oJƪ&2aA1!d! 
L4dƺ륭Jr0C:cmt,|ˍ6 X@h-gN2gh&fFEeAdpLѝx܂4l+coqӴ, ĨIU7{őj QN 9V4t^uTBje㭉鐎:K@5)H ѡV0ep'mHr9+-s#X^A Vk3sѻ+-dC|L={!LI9]{᥌ dp"V_b.ou+7gyd݇*Kfg$,2Db:delb12W^rInqMIA"YDXnR ):2\{Z)R My|td\wc^[BF"w↑ZQru~sǃM|7ٸ Ive dBJQH, 5>hB^x%ߌ60Z|&( 9Nї2mӽ؉xcIl}͍Ӈ;yL>U6KmnfKٮJsk6~)9-@RObt]mjF(??&N~+vӟ?etxr qLbSiv#W3~ߟ~n'jN^]x Fi\Ԕ)&OZ5z[ÿkc$t1Z74n\?/O_7w~!/XL`-A>MFӹ}O/Eĕ 2|݀qp34%^ 0Ǔl?Jj oհt69/^vԘ-u #lEk^XE k;jZv;jNdv&jdR >&cJeΞiUIeك6hf'lTv}JdU^Hz"0rERT2w]+r7 vfx#/~؟3ts<}n򰷻^0eӛ/d^k˔~z> kZ*cK^y%YFƲo>ۢ~#t3)#fI<: +Z:J%ʨ%hPƒi+S8i{׋+8.[?|L6%~-= ܟv^/G}~lesCQx5I -#*e/LH!%a/%C&Š{V F{ [Q4ƻR-"DbXF!TvA8 zNkF>^i{u2SB|d]Kwu ` 5GN+1*N& AO` `iE7y mxQ<Wo \\7o143IatQy,ꃴ|}Hg&n&ME9Nlbu^s]H@*IBΪʺb'лe?~ "0: &x;1)Ԏ@@j6dNpe ۦB6Խ,[~5kpNyf5 $%p;M2(O.$y}d*=VR`K)VwY3N}ܩ]hLl"p{,&v؅J-z Bi)+"+= uUXU졫Bwߢ"s9*B.FVCWWJ#{uh!*:uUUXUVCWWJz W;"uU6hU!"j gj*mޠRzelB.?cP+]*zuՕL,Yv_i|Wn2-lI7kOJw>W8=mkO_6)suBS^w`_$,(e<᧿|?߼o$qJE}.@˱;ewnt3S 3QkcBdLZGX\@߉_԰"OxU|s2ch4ISt{i{(8ݧ+$ 4چ l;^%+댩V>GZrWZSjP[zr;EIii28K$ Ad`Fؔ\6F<c"nV2 &"uVA"r!%3=E}Q6W XeBk'L*tF:$HdDCzpMO~PE>/}r:>E {6 aϞmQ9 *d^ ]]yx{/-ʶXjw~㵛We}U_~ݦ킏5KP\bJɘ(ո`Viy@SDAEo3 >^Iis0& 0d]nes .{3YF+t6h .xw:6Җv—ޚh'i)9 !auOḬq<3jWϴ"T'}7>MւTmS?.Gϣ <ֳ |.6N:N{]⥅vk}H념ay'3E!WxjݻP}0қ(tee yIW܉(C IV`2HH m|qVW/EKNd򄁃P1Y ":2ڔLNEB$%F =:p" ]hW]vpVe}%0spu\ࣷ_"6HݻOpNKLS΁κpƬ*,W3zkDRTYe~𳇟 ?#\JR9xDd4R "9iKZN7 h;{sMzarpz1r?\1}ݠu_h*}lTy._-'mxJ 6ZZJ%p [&6xQ$I3jT嘽\YK)3f&d* J:X;#q ^^9głw[y)j,_Ȳ`YP{ggr=}:演Fj|Azu0ޅwa{ԅ>kT#YkYJ{HBJ pu\nXESrZl:P9}\( Ҏ S=CۣGq8{[Ѳa!FR&GyL,xQE-C96qR (UtORRa4Ǒ#׿ɎpK@:z4 Q$MRz?t7iYČ$v: |^Jln$UH7뺩5sԫ-]JY\\fΑ5}[},><:Ȇ?=1P{hgC5o6Gq x& Qpc-<\5lm gg6"zz">ڊS aQBf5:,sj{{XZڮVx5ւ.uYmQ ?O)_Kd%_hkqEnP;d@gӷ|fYki9^\ߺgi0/Rfm(rd M֎ն ԜSʦsMǖԴ&ZI :@OӤ)pY\- t_h#"_:@j62N302*Ӱpg< y2,Xxgxm 媞^knD= Ve$dfS)&s0cT IL0DhBd={ {&$ ؤD_Τܝ`XkF9>0˜È-|.=;i2j~a<>Z)+(Ju]kgi`rf;RZCl5*tTpFh-h(*y)l:Q?s?0\^mY:gyq1ɸ_pȡIoZSd{)do--1h A .~\|.x8; nx K{]i9`5XOG>bkcJ>D`p6i7̵gi7u7V4lޗس y(<7?pmnI߱S?~#n׼}Kݷ5*eHKߥ¸Vn/TJg'MMطb/}mQ`y}dEpWlu}sqPWpAghp_N Z.͉\_N8@}Tn۳.@Cՙ=_^뻢iN኷3k}m[Q~BߗE|Fct~qR'?0+O8|@xFmצ;}`2PCsp$|I 7QՉ}$3Zg"k 
=bB~ʝק-~'Gzg^O^Vzqk/b՘l{ٷԅag#rl1 ok)D` g'EMLVF%26\"RVrp֏Ra,iimp 짹e)j[7a2e+30ŕnIJ[Ukl_99 z.-| L`gGKU鱂?j;V)VEԷY]@ɂZ$L>u`e0a`sdI0(X{*@:Te ÃTH"r/O5Z1zȃVJ$D!2ӪGu(|`Io 32]^63h.zOxIH \'&[-x$^ đ~X:dAX?3 \k ymN/2!}8WggvGq'8ԋ'Qb-=!xC\d.`A/hC$*Ú#t Pa 6 ?oA [ ͕7Z!FcNY gj,L ޶H@w[5 '*/;y俩r"6B#/^mȎ,+ 3ca(VHm0z~X@G 1w8@Ug =6FVy̦%h pN0ـ+G"ELm@hY 7kj0yMvX-ށLє2 WU$F'5Ia<,T굻(o^\[! |ͭW 0Pڀ@D ˞*E(k΀| .dN F: ]Ov6rH)ĩ>[j%`t.'â Lܻ~T1x<6|Zh!qhiM5jJCo706IM5}NOO3w~l|j`Xr+Vݮ Ro?>sv|t\ۻ$߻[Ba{Zlj 2v.x: B`́<{k14n!+㮦 .\_Gc 9a1(LKu!0e\Ա@sYQ3+ H)m6.!;ZggBAŹWONǞȇӳ)8vB͍"ݜ\|\nm]5 ɿmLkE5듿m/K.kOWyiyדտqb` ؖ%^]Woʅ%}U{9 U;˫ghZ4 5Iicw=$YXefVGI{{nͨZiVR< 6PU{#1& vq]tDdf<&n.i@c+28XzQK)UOKS ݶ;ugsʵ)f۴v&zx|qu+V(ltT9+#^a֊3xJ=G{^ݛ\^v'?|{=>bp1e,s>r:y=ۅ"jǭS7?4FUf5 jXw5;/0c\s<J~98<{У:񺛳VF:ɮZBQJ1J@rJ#rl#ڻ,ym-?};Bq^<Ӊo^\2I=He bQ|rx1G0-:mZs zo[&Bɟ~߽ߕ|||[X e,WG]MPMy hOW=9NQ57Iղ]O&ꚷQ4xEl1v@׾pW1ZMIwgKH~V©2 ej -q~cc؟ gd2IHb VFb ڲtn˫Bu*ift] I(. \l|9{["{ݑLشC><^@"A&uZQgS2Sx)I,;2oh3#2:bC*P4j}b+(;?Լ'/l9s xx齛."+b@2}|+!"A"ڃqE2F,+RKtEzHV[Q^3yMpͧNLPbg6BeiBZfmhHhHeڙy1,]h23sD;\9QIy&Um aMZyNi~ Ќťg8G)B/mr B^ d9i1RuB,!2t At At At At At At At At At At At At At At At At At At At At At At A7b0u@uha`륹`륵R;sRYdA:5764 ۮr[*pi1>}5 @# DIbaż\ʢ)M*dFNѠorh1f7?j%zQO j#payOhdx$^EͨiD u٩"%½Ӧ Sִ"s;["Db&3ӧm͙s&E *3ch$NJ*dI+%ƆRtˋޙi{~8/X4 GمsJqDiQp{S7S`jɇ/ifY =$Ӕ=35qU07MZ.mRFx&}n[ڥqM0b>~.L1uFEvܮb/tvy-6y"ZmV̳^t: {Vr:{ ~:ƗWWmrq5 FMòV~iA%wVM:*R'tU"w]W,[mVvWZ*ۭn3{pKw@K֞l;@<=M"Sh_C&8c䚦LvܦDǃAu&LabMaՏ F|tP[eIao|0x@ӫ4G kfwZbGnbFmk*imO;u0@tiD]vWdfKi_x&ԣ[862m*Ad VǨEtSdeI _w7lR,~}8*gaAfD^MT DD)3i&sF֡hWe*yIʤWz`'(9a󶑫md6" oOٜ/r\s[JŊ!+_ɏ)%!!!!!!!!!!!!!!!!!!!!!!#dBXrH~L@`}8~$!V#V.&#V#ۉb3TO"FPg!9:w̧Q|rݢE;q۲-6ԖnۙYNvCV %Q$UΘyf 7(& ,uĩq^^97d+K?C;TJC5e&,^c䅯£U3Jf?I:/=^L]/WTyAf&oJ::^ߣ9BˉWm/;Wj#q9TXdaȒ9BRNNV {gYyA8yUqUkΆW_iLiMG-"/06reT.se9LSHW.e+9!4 )F0XtxG9n?7_$ xrIt3>0BQ(5Y(LU\2Q=#9{VP+'!N˔“d%]Kuy` zQiDh9SAG8L{%WOl%Q]So^޳B#M$(LHa1N,\ל'Z ގz* /z/ZvʃLF]@& 
8Nt8c5UU2[IOIãcM/ZvtʣþMQYX;?q<,.iK~{/E;?.픢m"@;KH@W-egQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQvegQv Q! QΖ-yq0ZI^vZ$ξEY .;;[DnGx;tٟSjBAVsIZY4jM"7"@0`Jh,5j %m"Z O7T"]MٔA 俶!  ]~n~ .o w;*;ߜ~%)z̮J5i;18@1ߤ?r ,>.@]&?JXy4ϙ. tսto#Hߊ)o$}#IHF7o$}#IHF7o$}#IHF7o$}#IHF7o$}#IHF7o$}#IH~;%Ez>AI5]R.$O~RI}a uHuŬVQZs0u{:R䨿Efqԗּmn_)C1DF$]ۜ%+#RӟCMWڞ}mE8]wq{4ϒ#iIx1'&+L8?@W:del"2Ur!6yd>՚9̉03`o#7ASqSh=eǗ78o>9O?bZF)8ZXJ_)\~WیY_rli\\1d5)qDUL9)[gN0v!Mp/^NUhXbuVŷ(JFQΫ\uiLsuf>7 '&IrFHi$wdV2s8+0?>A\edShT}r~]VjK}KJ*%KuBI0r-LªL-0˜V kU 5ޅWI^a>5d[_ˋF~ge߼ >U;X.(1q9GnSI`R襶V4*0puMQ> f`^D_uu!{:t'k)`^eU2uDUKzWk3S[Sm.cn[ĮjC쪎WO%xZk?n*Z(TܽzB*Tm_=ޑdH[ '\a(1%P6>"1ZD0r tɇ-^sE0𓂕Z#Ԑ p++*}LZ8FEZ9WV"/e""޹5LҐvo/˝Cλd`nS;s J:0GB:0WZJ)̽E0)f`պ?wx6| O+ o!~zs})g_Cp@ )ldHZ>3(8K! j90OpbUWCͯ"֜YhuoF31-o $>=a+Ջ "WBji]߶>^'6hP*x}=04&zp-85=;Jݙuכ!ZPۭ뮲UΒZz%fZ }i^{YG>ɣ5q{.: &eJ-LHQyMwx2T68k3Ɂw=AaK8 fNVa^ X͍m컓A.@A3#W¤RVTJLf%|a)A(!ДRWNYeE2P%d4tHwq)m 4%4]ʎWʮwʱGl=CG1b \}gމhȼܝi5cDV ,7*2gE1,cqfw𳃟<@$;FƣV,$8!$元KiT.SO`en>Kqn'NĺZ\g6{ѹ_W!X= 4RY-J%HlYp.õ$+Rpst!kPU%8:sY6<(Ŷֻl=BG#-A,-kq0"ǣ4o /=Zma!m֢bWtYR[W?}!EVF[o3z[;2$(:4F (xCX4nF -!wge nʺ!}wheEW=a:izߏx=6>`eE6xZnYĘ haAKR$#5\p>I M{D. h E FPRݐtDcIU^~eʝ煒>ٮԺqS&ׇX_ִ "1aNNQCV01جe6 <$ʂ)h^Jdz]dZ/Fa ~=%Gвm=.YfQ -뗿ioMHH<?FE&>_^̤ٟg].')g@#~׻bLפdRGܒ1PSL`2Oh2%%xr̟-WY#c<2gȠ{\!jsZ~<'<.\i!sn:?|!~i=u?yԷdJ/lŋӓӕ `%NrujEC.>5y@h j9Gʜ{HmuϞi:~28[CL%1'kfk+^r 9Q?rq8|4#)~MÈø,oIG,XvppXŘt8`G]LrӨjcuR:p!#e`M8&rxğ#NBv*ٲRJWCKףr;W.![_KhFYZ]\׮s̖M?=??͏Wo_{}+ DeD& ^kho1juY|q75x$*mz6ViMߕfsON2\<.h/񦃱?gg5ZT2ӉYµFgO><(o  |C*gd(C$SPh3h2e9XZnuYh¦?C<Ҝ!" 
.{ 2sK==m2% nv) g!gdߍ7;FjĊc!فm>]?Xw7=: +.Wk0`o0VvSiyMkS<*f:>(hCh@(R@T \42T%:<.r<$ݙeȒlJ;H@]%k4~33iv:6U0TBh(2J`lNLc;њڝh-Nh<1m1D!Z9A Y B%@zprR Z\,,r2Cp\)"d.s>_u0썫FΞ^*WjAa zmY#*W~tb1B&~$<)hI6 E/d:&nsT (!NܐdQg+wo < $$@4) :6W`r-rv *% d.CqWWQ]H6[uY-+o{0i[Ē=d;[Tw-pp,Pq^$C@Q:%@J!dtU5S!rXTF]#LHϤjkj֌J5]X3n .4.S]mzV$!vM'qg4~0}''z H29eV\ )gix`l@F $3ĺy`̊{>L"%!EQ@4*@l;f]29bfl<٭h.qMm;5 /D)CVD C@DZ,d .&< 2WUfm`5Wd!2䊬(Hg(l4a>$NkV5͍q?aQG;(4b6̆((MX& }rLR'EP$z@A) BH<^ x(w>TXQZ|2œv+?L( |WH?$L"KKJD.aV8]TvP"}{ki}S{z u=[IGl>4mۻ.~#"\LҍqjPvu3k{ZdG6;rl; \GEݝύ=zr;?bS[<ݞb^ql+9|GZIлe[u-Ys1E}4m x {ckućq%J^eWISTɨhm*O^ySWk-}Z/f!-E$g h$8J^HE JfI5B[LD9{#J!@4@)1R`,J( 1'Ta_$W7֎Fw&IPL٩col";/wT6oJ+R{o9N *TQ3kM $f#㪔0t[MrB5PIp'gϳѡ0EXky`ykv 3Bp` k 41ϼn_[lJ娴kާ}]"SH6UnTd :& ].Fh"OFz5N1TGh;Q꘼c,3I&(K*t9sr9p- s#ݲ:F{u6XmޓVswCQ(@fyެ+WK|RkO>D|C_jm>+Ivef`1 鲇Dۏ*svNh!SBNx=%?Mk`?{6MCy& 0`_}5%(98}IJlQ-ʖlHbTwY]]_uW59l1h m:K^R)mo߀au|ھp z>S1 a$'c.ۀqpʹ.c\L2BG#0NzԱ) BXzƘq䙒3,'GIQqAankiQ#zPTa//5eDDL `#2=iz4e_?OrIu-skyZCa(dSe|Vq9f+kqCnl+U)+;q4$7MSn Iåvixs32k) ,~RB/R[{Hl730ڨ"yjyҤyڼlҼbp<; :x H}|05Ix&^n] F6k2"R4{>6&}eoC7ovaRt4EZWu! UnYjk]5|ZX=(k{(Yk\fSڶEkݺ)yn^z]TK֞}dQo~OÅ9Rޓ]|2μ L.E`b*x&ycp0={ҙ'oT[ew,S1Yyڬ4 =ZLV=/DE f0h޺ncD'!ڈtQFE^_#G7;=Vv*y-F TZF+A|ϲqiiKm`x/!D6p1HcfhE pG;׊KYW[ӍX\>H}[ V=O⯛SiӐ&0B'MHkz0 PH;iBRu /4+B< s`UWC1WIZ\%)5 4WBq,+`t0*Kȡ$-{J$Йc$V\xWYRPULp(G>fO]ڄMn >xX>E[>,ZxJgSOWO, _>mDu[a=6G,\|gx'z~A(-6Dk]#X44( |CfjG?@`(F,*ͥ&?o 0R#7 w}c0_$rdz~y2[]xTҀ<͇b|w[1(UYvۻgAP̗ן2/66eW=P<,wVpO.W6w^0;{M\ l eR1EYy3;˜)l^_` K^Q7JE!I߆°5a{[cR;ĬW ^ wE%K`%*Wm7y-o?Â?F_4/Okwu)ܗ"PDz߆A. |jݫ/͟ïհfk;ɾ*~/מ6O*2RNUݳ ;{0o>>x6Z^Xh,]lMĖLtۖyY hV\*gQ1Luf Y)J׹T]H՞ 0#a,DFw\h0 $aAl"hR1D<[֑aYqGU.k%%RHDc)9E,tdI%sY`cg525q07e|"ST.qTRLG[{.F)O$ q3aV$sHx"ٹg&&2~-F#Q"A%Nr<1H愥hӾZ*;^ %(K5mG֍R{grz^Հhr8}HC&\R*(S. 
$ s (ƌ!m2NcXhm QFGR&Utȴ<[g ^g=9A\;ա5Sr !K yqq9'j,[@ CYڤ",QNkc;!=E(2@zQHD@!չbĥ CiHQv iwBs( eZ݀IDk1N!5[!--99lY _ Vofz3`IZ Y n+y nCA MWpGifŕrH7rh%Wz.mf+JL u1xigl{Efyo.6nx|ss|\ շ7:Hc/MϪ W&ZcՅNbjv8 jBdc1Wk֡ ȭO5$s~;١iw~|IJN/患C3w arkVjEe`De{idcI$Ff!݅9@vv'nkWq" }^w{Rk'XF>s&%ȥuNS2O73<"#RtZFǤ"9̰4N+ L (O3@20^wX((nASxXv&kF(m11 Pj &!yDmetQs*:?'Nkij2 JpĈkg1 ]@@"O"1`#TH-OT4h.L4  Bod". *(WU%h.|YFWE*0 `IzbOJ07)>\fWPW,һ&S>O,vKGmaIA)y| F8 tV] f2fqkrTa0 }wZlY Nᴤ_{7 WpT#v[^a޼9^O?@EՌ\c.rGH2h%簄+U/g]^^rz^}~9[!Lsa.0gR}3 "Poӂ0woBu=IR{whQY>Da8*l`Ympp=Eᴮ할׺gUEyw 9RTq4)X.s_6N1ʍٕR4o⇜[YJ~n?s A]J[xE޹-OKQRWze=2b{jukUY>bX^&?~?黏Na9}~~k?LKdAu"o'B;]]KV߬kXyO] x~e\4x&D0GфV^~ۜm`\6?{%?+à z#~ ߢ|٠NI .p1B++fb DK,(9@b w#3KS)B2% *5R`)]&H r1(;I j/qo6ۄjejJ=*sGYqރx& ڄ[CW]z蓤*0|W"EJct2򰴗;kHE>ӒĐNՄ|d-'Eut/"ߝ580sȥ" +c6 5 #e t"Z1JwH=KE^6M.p_Ȗ#ɞL+d=l٦lXģUSl6SŪbB(6YC}E2ܵ"FN;e սQ [p19%3hJ1L"Ef!RxOAh["da ~}:qC,R&Jh6xr4m"JUˀjЂ[l Dl6'U q5D8͟2d"$cbS&GD<𣡢6͊J51-٧^vLNJJJߥB= ܾ ~8*߼t(MH[t"`=_CZ(5Jڋ+XE;J(oaɹ(ju.-C%\|T9@@VZHi l87#c; I]cpXx~2㽫aɧyZz 0LO3GXϗ&Ԩu`WP+F`]:Zf͌RNi^L$-& BF3EgeK"GJ͈G+L:ڮ1j{m/UI`,N%XDDD;$e} ,I:v^S<,bH l!33Vt҄5h),G8"ۚlT'$4x5sc6TqW| 8 @b1'(ͪZ 6ŨCLlJzOIeS2%"w&&d@K_,JT8sLȒlQؒVL^JH*A͆s3"K#VN:[IqшƸ\p񆳴XHcFJNbL1ADR L e$.nwIǎᪿ[ ObY.n~|G 檰Q^[H\eN C+ǧĮ~W8ϢHmZ AGr֠5ɛ(Bk* H١m1b@RʶYkK&#^*/%:`QKf2v6N;ݮD˪U4c{Tx2EoDZ_*lqEV,TM,Qt" BoM0؂I͊-/%%F;hn3jгA I٪X":12 *䃳x VX1ztYo^ESBYd['P%hTZzM$j;W6%U#BgmE B¤Z^lݓ j]գkXbjy&2l `*yEQrN<̜l2+y$K2 ZYT("JL(z ȐTc'^cLMOw;$TY}fIVӣ>[+CCÇN嚫w'M hús*HDYi_f^~lQD*yewĄQ{{ˌ4&fbL C7:X#ස}HW>M֟LhN%F*`#{JYoN`WΜ^^_XJ&^|58)<c:>kN)juo=~-zQhӂGV2 {=뗿K97^3/ы?)u+ )x|* ]5lRo^NI>:7y%avkׯzhwKg/hq,NfG% F_|h̖twKdv>jH0.F+/~ǫwJ^ sVM'$QDZ|:1jI;u`|@B$S)zp|\_H4s;=X{"g’%3dc c(ʐN3)'XͺCms3#v ,b&WFA"*wY$\`76x gnf1_><x͍Um~W_+TySʱP.$}ZsuD(%kO%PAʱU9뚜i8+VK|.:zz'GGxvTu]Yzfvj{W Te=3هݤw*|= Af;9M%0^ycu0QwD e Q']F%\)&ԥDAZ jF͟& ‚d̒9w<Qig9ɑMA Xolui]pnf+\~^xO/\>3O;ZFוíA\@d^&0ˍ$f૏WAajGw-HK1iբ3&!@Z?Sjw'Bfl6гzyncEebncSIHu?NX&"0^;I$KQ` cT?p=2r/g:2`0tϮHfk=d! 
)Jea0RR>vkZ :SNZ%[Dg1qά/7?iTf;K~l`7>*!{%tƦH[ϣK1-Rj+]_ކO˿[\2i-DݫM7&;xFk noB z}%ӴKe6r]׭qI߰Ɍ>>==^PJ|úVuL;ZѲڑ~Q[~uq!u]} ![ :b^'jkN../5DsS xp[Yx]k"9gGL Z]rWBKԮs(|'() Q<s]''gioiFx_jޜիoH.FaCLL]@:eDIɷh1쏾ڡ)OBtű_JSe͢})_E/|e|XJ㴺"^*Ya4=^oqGք}T77hu6yK꫶St &{~Nۊ9OfBtWI'^d Uqe"YR yVXr.8:dY/ue>\*[JKϼ"9{?N7_iȹ2t)WӘY͚Ni )vk>~V' u^5)V6x1L|or# Z&ʦ^3Ä]c޻I`k'a/&߿}q'i %vBw;p^T)Q gHm+2znDO~WA^>FA+IB|6o_};Z:y-?Lh<>2 hЮw{nr=lu&Lj򴸭 Nﰂ)6] |6M{SQqWW8'Ck=RB1W Lkf*pj#);0 z9H1ծ*j؋:vPa'bG(f%GaQۉ wʱn;N)GJ- ip.Ovj|H^hhB^v@W_Xe_b;5F@c8|,Da^m;mЮkf@1|RN(63_;>;CѿnU$=@UӏOdBKKHE*'tҙ21ɜA;2̜ jFyQ^i#˝`)9AFQ!ʐM3!`ժ{E;IP\rVOfe4LX . ֖޹ߞK0#2jq,&6JYJژoDqLAN!h*KٱUC,h-• ~Dp%aD \" \BC+Rr+ITJ h Ņに,WYJٰW!fH`MUc,o Vzmًsj^/ŹW_vo*}VǜY{ZT`h!+ ,>dEqAcAV{YqA7hP*AWu{~VK|X |/Q_| E1/g`Ts ƺ)T:I)$T\gAϹ3fYߣS_NH6uRDyk ~ZBg' =ۿ۽;|e7~1+\ByծXC߭0~ ϫ|8,N;bfF7[!⁲FI+t"".A\j")6*v{,{ jqQ`KIwY&iE42jj1LL451'(w[ "RH)",aS9Vm`|m 8J{:(z2j hKog]ֽJ~xDg\nwIy=]hծVGs\{r\9L^O^]^xʓPd?ƭxBnqP[cYh0|2\D+ȋRhTӥTȔT(KbώwT|{cQ"CUwP;X\Cl2r18}J& c4khF#y|PF) Po2MH_$֎=e.;Npj24/iPVCP8kz~0^)w,K%. zf*PhuѓD;wZ: l%)EeRvYp80/Sa(*wH(Y1琒`3=Oul{( '/@3h*HJgD{2junX)*eюZ3YFOQRR9Ӟ8),K%oI  ˤ\;{/v,nJ~~cةDaja5]#rE:SED4(<eL}e=Ua*>=O'xbLfW?T1S儝ޒKPSOmo\?@9mSM׊D+PpszB$x:qЛ;/.`tI}]bDZrv[[ɸgO~͸QeМp#٠?kUN4~J@Lz_C\+p *Ui/ţ6.Fg7#X>zza8p>KTIըoyGZ }~e;m{AȢ-I5,-,nЬ|Ɖb[a`żw;IGlݴ7 M:VF:iZUB`Zk?odKTt{a+#~FOit-io)TreC'.~|?In^ij+KVh(+q'a+h_qu,ZTۨq[W8Iϟ/~:ONj?}e?~O8.q:]$;xV~@T͵XjYgNPg5&TClG㵳>Ul1Hf  ry^<T`;bwI8Y[uZj&:aI Ærvu%s?=UJoBQ^䘁ؘ|o3H$C ,ɠykF AS@C$1Tnt@1"(HR$  bgH1um]vQU-51­ ^ %ID.8}Pw/ T HG4%߅h> P&m_DM5^f3{O#_mzcV!/P!Qy*gF&˦?cCG<7$q>JQGObt6pojEF!#f:$NBOk !ǂ(HbPTጤR؄ U72&jdԆiƶX5cjpX(~`|F ԿAG/޵[7.2`R|8S@TFHʂ;4B83I`7dcIe NHhmJ>&A׌ص]fLCAڴc[5nPaKOSgMJK>$pI3/pŧ K;ΫT`Ӓ RVk3PȴWHc:H FukkYQ/Z!f`<DM?Dl(1)h-T[6{ŏ!zolh4(Q NMR@yR"(Z{ Tx|wjg5"]Ej؟ >(_g]Z%.JR3.\\ѥ)PM! 
54WnZ}Dqh#wx dD(I>O}Ҳ>y1A ]z?603{0{y"6`iLH-5]}R"ٙmu$ ԻAηqPySgU-t3yP.Z]ū6C~cW 9x/A9sr V(%czs4Ңt٧ gb 9ݒ 5m3kgUc]Mq}r9бqFYZWǠz 5xz(|k_/m~r>ҧAlmOwu)+S1[+Htb[7y;<k| K.v_ˇp`\q5Zgv*D>l;4H Y`NfIGzI ~=WT2$MlIlKxTe*T܄Ru{Z^O8*S^X)p1MTT.hB$%F \ nV, q,KJwxPNLsϮhtwGb * ALh|)M1O7ϗsⴆ0f0dL[#Ki~vsOgNUI*gGLLRFyN򒀖V{Gj#lW&k+e^@>K;:j$ z.ظClaН6<%tIG-`P8TH,H[(E")7`n=`9fo#{b9W:g uarr\k!0(Hr`W|+z$r<Mܳկ=r;hc5~ \jx,/Aᓒ|3M0чnXv/3i^YN_>Z>)4+Cحോu$rB"6Hޒ(lBsBo5?|Zf8|^ >`@0sKc=>*^]*z:y-6f44Dd6m>WoK3W?P;*m2~Й\˻>r}u*gX2{G;{Ɲay8߿-흎Afo?5m%Ra;ޭ>B($;+o21 פFҋp- wO×2:ɌaJ9 o춸ѹk3SD pzG=;84iZȠ Pۍ-trnm3RTY7ICQŘ͋BIJB /ڄ{aV9 v&^咎l MuS nUA׮FeǑ mCDLiYh*qi:MJw%SdIW=$V*U2(yCs ~XQ|YE )x$ LjyU,C Y0ⴱctx }Y%H Vx$ۑ ox6cQY,TGW>/A_ż d-Š.kI|ӧU.}t''},2di"YrB,GXktdDPHc ޣ.Oy=ŬuԆHT—Pw@10 1Bk ) 1E-=f !nK)5̒ Xu+P(v,!D;f brogcQ0%)Y Ǯ6 3kHQ(ͮɀC Fq Q8TDYUQõ-ƢPaVB]$1dlDH!(rĪMPk>ZlOvڳF& h%PD[vjV)Pg"BZF7k Бd `F}+%LalmRjtjϴ㠝hxha\]/N6:LU#A[.n[ftRX63 v໓("bjЭ(ZkM!JKBq$RVO ]j31ig0IB_LrV{R@pPaRDluHmk"* F mE׳NJdt A2 5@JAO!Ƞ rwc-yÖ+QB|?a+Bq&SK:)H& 9,R7y/JFF-LʴBE5Ff,(*Fb1;L ;CTXD t أt%i#*`hAgМۦwZ,̴RHk֬Ug($u21KCpR 9O:'/{iCg;$:TY%W08\W=p+;DUUb+b+b+b+b+b+b+b+b+b+b+b \E%W0˗\EUWg\JWb+b+b+b+b+b+b+b+b+b+b+b \9!^pE=y1+_Z#;pEVZ՟2HW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pW \1pY+|/ BƝ/?{x!WFѭkb+b+b+b+b+b+b+b+b+b+b+bYO ܝW?~wJ;z#5hwi.WxA!nn'kv\\^}Z6/~lA>sz<_lzNow ?xG~\CMt0}R TTt8i`l^V`1 7 ꓆$X#]sBI-YzZh*3ڃ6sdUJ=oq.{PPK4~Qf{ywh}+|v1R^|cKKm99s)s ro*IM>"HģD-P+cF6J#yGv*n9VoehQJ o%#WϽiDxXcT|.Z|!"v}I^8H> G%ׇrt,r`"A ??NePECt{H7/ еSb 0ݑP<ǣ>.~j<7 JldtIB麈:koh>}Vf%T: 'wEw|z|xΗ4yˇt@7G?#x\4.~:v#~5?Jm``Ґ !DG7=R_rOhUʨ=X:/g__^[O&Ն}ͩc:K:>|u8|MeHx%#0NjM&U8)6ڒKƇzIT[CWB]h҅[`hry^%zs}.ӝ+~\o`&~ʷ^ZhM&pov@]G $w^}!$\>e/)OKm1@v-$8TЫ~{5aou.F_zrO}@ op`7Bd}n?q~}?q=:vrηwj!x刞,xX>:򴅃YnJD(.$ ( `;ismwmCww6ƞe} 0Vqޖ+ϤtUXN!v//nqF;:o+Z\~l{MV.^x k o۠+~i|rʹzmy??!-[R]6f6:d5nPiw-mI (Av~?|rGlEF?-)P OI$EQMNƀm\ `'*5ڱ1B_{aNAQ}Sp q.AAG5=:M21(jDfZ8k$ bMe#Cx́gHzZ `%7 f/$*ѡ $*&Ð:%HdD[u3s< AIX 3(( ͛`}^Xa.}" {9`$%`x=ѸA{¥ Im~_Ghk즺O/aϗëԮz]?]l X"1qy:T.bK\܈b' 
8}'Wן^kx(y6K+ߛQB~*65sC2-Slg2PF0k"S𠢷Xbo$t>4ӒhI}U6zTeG鲩Y[t;wՌ[&@ۙ0%M {&@s=Um:k7iXMn7#DJWU jA+x竟? G7׿Fn C`0) `Br~5kxmoڅ#f]Ҋæ < 0s.xH򯖄ZPnkψ3\%5TG"qcå!_J.h "&6*bQfHY0){ ƒpVaX2Pe4t}KCN(RAx)Za&R86.R|Vb i|!ə2? ԳKs+;ܟrZ?Ab5ˢ&ZdFo C*0;~ eK-I{ b<7߁9yzٮz٦h-vEgYe) %Xjga؟ǡjꡙNTѾ.Z(^Y^[IΠcA)r f4ɋ5``Q*bQG I)):U. Gu )E/(18-DY&1Z)ϑ X;+ܼh1q4(w¦{~#? į.,S12HB3Kf !h}xL[S3MW YnX{8uޏ/aޏW.qRgjf 3O0P$;@'sݛ;7GSqse;6{Ud*>y]i宧bඳ6wۜe#z'/w O}=!uJnTY{V]_ SfKwcf۰CFm{b(]CX]= yeݔwf[!䂭߿]{U+$ޣ浒^d^ >3rjKs$>j.&Ξ7dn/ş$Pmqx? -C+Uԣ31N{!A:xO`#x2n2IE4ch9 -8Bh9jca4(M4sL&dgSS.&cH0tQjq H%`ho*Yj IxǍEQu]woCM'A<9x(*Qixșl05SV " Ni1D @E`-'7MlXllڡe}vT nMW)hrQBum~+MޭT;sni^Gz ~mifOm|s=zmR+?̃r5i.w_ B֖նffO7cZwYKF@76WYUFvVB=kb`_GU_7rL9j4sX)'J[KVEofv8h`Wl>nE8T0FUkC U׮sԎ*u?=B| ǿy/o}_޼m}ͷ^xelWm"(d~x ko߿iMK5 -FӴ,Ӯ.-n` fz 1!Ů1ښ0ģ$qԛkD$43:e|.$}"+kb1d-:"gopX(]=`o矬 &财fd0&`$gI)Lvx13"#\ȹNq)5'N͵怄Q>xVF' syjqY,-r\hZ8k] W'9s 9:zBwh̳ř1:t)K7ln!U0ʘD0Rs%&@ $%hW\[/, S2H`8=1BIt)2Z: |)q XňmhK#Rh8$ti#26S}w}lů6ĒweUװ 7%F!z[%E_4MU;uxO_T W_Wp咿*fҸ0TW_Uk5}qv/媳g(Yuv,}LZѠ/"AU$+8xm6nV߯zfl |=鮪UWu&ay:gL?.΍{w7A\C'U zG s\T*տ1yG%(SUI8!VIP128\EeN0?|'NGgVw\B;8K,t)r-n54(, Eb$],WC5rQx7Ѥmm]'sy.DSFbR0zR$B—LJE`X\^fs Gn 3rC=c=wĀ/r tv3>+vd}84Sƀ ".VdX 6@s5B1{^kKױ !1 )eA}<$__8;# ;a~b6 xm3,O Ov4*mBDZS |8DB-ME _ygе#:'-uڭx_ܹNժ"xs6YWPt(P|8Ê}!XeO Wv?{s_7ܼGk6sWnޛzt 53%a^y˭->eàeƼ\[8'j ZV@jTObz.fA}b qQ1F0+~X q:2utvzu~Aq.vz?Bq(U~c,pɃ0i8M Gz/Ǿ?׷ȫ >G(>`*\xlM1=_2UkBoiBS[iӼR[xU Em{͕YRWzsb2I:2~QnC/ŰqX C4x2y@ Fϼͫq4R9JQ:8QrG.өn:w,G-H!)pKySQbɍ4ʏ0Ȳ%Ndw-6MNe7<r1IP&sr., dgj[$mGooU~Ň\>F*LLK'9)t}/yw$Ua ()1 ^ V* t*yQB r ޵[!AJ-Iv!~%0(3Y`jM6rZXk0>q֗hT-J{C_/Ǡ6 (ټy?g͝W |tg[D.eS]HR(g9dL1 ^uZߚቢ=RJYC5:)h8d0t_Ćy-6xV[r:q8\ R=AQ*F)L>)%k, "@rxŒoSJj)11H&R gS:Z`5FY]nX%,J;.^7i99+~$ԜrIч{;5~Oav\yz?ךU,_Nz!&{1-i_UgNn8O'jLFL1h\`,w#^>AQdLqO CQ M'a쟞h{0'@*ӪI}_x2h8:c|{:tqPOu]:J!i/$qq)~ WaRg u,{'-~(^*>C%K-.]YB* WT"^T5W@һw߿UZd^H ^;(bEb)@UMQd j] oB .)54 ĭS4IhUޯ9W)M#y+^G|+T+?ںXv=&}ں@L?4kd.]Pl0nL?Mg}hRӦo]lO)°B.ѣU\j}ݱpz=y' xҿ\6w oyݏ5Դ_i ۄ͔cZLEv}_4w/nŲ6Ȓ#;CJ3r4W{uAgښ3kI=,QE`DqiN2<6vHZYFy~=-ZͿóTV; H*DH)(R5_)όf 
^CgZ1zUtZUkHSZ-Uq۪-q4kprA'[;OT}GՓ̨!hhZKݍA{BR&}4htDPƸ1 yAypᙃT۩@*p\1'(EP!U& &iBKJ"HJvP5gOGd%ė(әiP̅ khkTZ*Z& A3)f G+cB{V YKi͖!xa 0a!c~Y yX %h Vh$ڀӶvDoX\(_k"閕o&ц7(r℥jtZN <0ѻ@ ԣ=XҏƟE;Zmô%n^LI}#lRVMY_?f$rt3_/]Rfؖ =7X;4Pe3`Pڤ|k$GJ\'t KǙ-%: _E#BvAxK[I?(NlݓO# Y.$hȠ#|Dq6Q) б;4ʡ1|FrNn{Īs? |u~)ܫה):i ]|Ic`$.?21 ' h!W3򶣼 z{yX3hcp|s .;m<Ù!`la,\-8 ѺK**] xHq^(+;VX/'HKCCd1!AZ`C+"&aD|&Hh /V_5iv.Őd &RDlJ\R*ϔ)Mh=j5"r,T =^PVU<>w-D+(}Σ-?OgOGx[~TVHLBJm(axѹڹ{Z3BY<1QqQ7QSgq|),3jH]=Ѡ3-4$O-ֺ Inek:{?Cåqo R!3)\ӻ?6=u4#,> ^5BQBxu0: UɗE<~٫Ӥ8LӣuN키'j.ҥ&vtRh I m#~_jlLJ!C(~RSak/QJiр5y~Wo[ Nm?ncS)MpZo䣮Gj3繮 \@)#LC(w?O?zuZh14VibXL)a@n{FJpg06Ef3vbE[Yr=wbGv ؒ)*XҚ )yL,kZ387YeE8)eW&sۚCPq\驽6ʎ6O!v+9lsh%?#?\O̧Aa's fV18+HbڱhC?;=I\"=Ai+wƒ! ,I)$&3&HZ^84.4{5%,vcakj[گU3tj?>Ob,GMtɅ?He("% d גHM͕b҅ZDq 8G-_l; iCZR2#{V&rߖ5QJ'^;rPm$pbl}gߓgQ,<sbthaAK'.rd-AH ļO-6!Jm="%&N`]T[o,i\hu,+EΎ.6ݣrxj4ϯhr;o/xf4,\k%@\Kdh'FB8-4 ^ ˌw44) 0Xd !ʫLr8$h.l T*Njʡ_NX؟{c]lK aKлL !$f9Ybfr9qS-M+kEEQih%oE9 T 's :mv\'u +HFQK I8rdIiYmEF.N!>z m=}a ~4ġLT*L{4s1tP#π\E}\2Fr̙!2hr WH~=&Op&fx9;/{.|HD3r߼Fxq}~&d)UBdNhE?^*֍4ĭI#Fƭ1`(2e%W_;M6=skGl󨋇\7g1X:.o\HX25Gc_*#~ȫASɖ3/5NӭկOϽq,Ѯ,_pqA_Y*,a?HUZ|9㼋ϵ*Q q?fnWwqFBr=\}u_oOh9y߯O޾yEk}/esh F>=x ?}hm M-V2n˸#5x<5*Xٜp޽lYomh$v{3 Y:$q6A8_\| d/ r;; R7}r:,O?Ap&oM&K=]FGk3v*sKz]m2'fo,N9ϻ}Vu8^K-x⾵ZCu͗T%gBwNNtsfYE?Zs`$A#Jt=Tɘ Pt<m9-l8s %u p:<$+KTHdH27EVuHFWʮ/j4eweQX\PE?UfË?~ѤzYq&5,s* qWO?U7z?nnU{i#+UJ zW3t"ٸ8cWxyrҷ9hdH[H m8"8=cF+?L{3E]s%77?azV }Gqvwwn@O1uxڜB2x93)*WnN-^KѠZ!t5;&:ƞzsr)Y;/7&EC[H*^MS5( ?%%p4mZǷ=<QmG0e'sY*[I![TpEpyڋ|'4Metr]J̶0 1ԩDGHtL.0'+S%1gH{zggݩhkMzۯJUƘ艓VkSUTdt-Z,PJT;ЪTt^w*>wb݄>-

" !WdEG@8@a!H!5&y]6F} Ÿ/54b !x0r2J(j!!Hc0`mB>oU#:Kn:E1j%LD.z&BEic$KZxU!}$5rk3:^s+_g[\rOYzuzӋ[,;! "-Y ǃS,0=Z ؠF !$PN/>^}w>TXaZjNvF0793;yy[:-t$,YsKY-oyJ'مsVr5~c^C@>obfGD$b|VlVLv7fbqL9HW:ymJ6Y!S6e}*3K0 [|抈? 8X⶙%RA︳  >vќ ,N;ffKxeEKr*&ic93rxJ"6E)&A-JmӐ ޴>Rioj"}1Dr+ w{s=c`1Sw6*rz9l,ɒ2jAr"sM,X`vrO#Vr+W}/!ѲG 8hb`L'"{ epI%$Ι2g\tShRH\զvt[ѳ/~V[Y7KmyAwP+սAsʳ 3(<4'l&H͙0tR6nh|;+ͶYmCiC h-h2DRL:d3Azh`.pCIǔO_qd*B>M )J>jù}E2q:>#]n k61;FOGf-/cf_pK<볒@+S6.{H,b1G[w DÔc:̻DoOw /fQjlr JЍ9{oɾP߾9/j)j2rHiTls:?+8e(Cdl8olL1!kȲqO(V0qR 5!>1L(,S%Y͓$2 \v4sPJ̅+Ȭ'*#j1h|l | !A5EWVW$;sg1eR2ѩ6U&.u)^眤P)TN`00˽*fu OfJ{̷i}-뭮3aL=en5WmENvKg>J Ek+I f釃D߼푭`2WJ0FHB(7hGPs^ߙ&,SddYH#)+M)2ZS Q$|.q[y88a-:2#+HLAU  v*Zù.0MǓ4xف6/_ J.E$&=4= <f b:Hx^Zki;l *`mMВ5`ud, :Vw.; .C_I9D!1}`" :Efec3Nb [GuV)=?pyF8]?ouU_wGÇ J ;LdgQHИ\E`B $½HBK2'`: ,wҀ&'%c=8Kt2o#Rd:(*m6Yd9?|ik,xRqHI"cԊ#FėIi_QbsNADf@4SE$NNj\i;[f=-p(`A}(m \}R !^xrtaYGꉋqER+L$`*pӜ# C ({Yd&tbV%1APi<`2s4(#q Rk7ݵ$&W{㼄(mAdZb~^O_M_ t]O~@A?cO?iϗl/ӣQ;N'+r_a 'm0GEJ?N}ϵ{!rҮ߇GoxzHw2%h<8o7{;,N8ɿ+N¬#"u NqʥyHXcukӞ6n=hc-KGT>򱗣-j:W]SD3n:Kg#O}DD-ڀR?0c;- н埓H[t9F8q}٧6wr ˗~K`џ |yw*}%^KOTO ǴOebfC?{sCXz87wP#~%MW0N7Q%=iY}U#jT\-:6i7]XTlA&vvQvz7e5m\,щKBjBkQ1u6ԠsWYY`{z3*D}gWJ3L@%WA@ ˺d@oL T@'t4) *BP`G '}\Y6nUڻW[.ˊpeL*wK8 97r£h|CҜs*چ$mذdLf{KM.)@~0}%-Dz=OJ-e-iYn=I4α B%З` 2g_wK(3DmQcLڃV:DG# 1'!82@ A#QR Zù7Z mI 2|/ t)+w&FԌOkx^i;].S,C0Ѣ:0BH1;("R]ґ&8߱@M8}*;G*6k^lCd %C6dGdAV5N5_J@-&HJzG@P6N5kut$4ANMGWK{hVK{Mͥuw7#^*qr4uJI4 c^em X yDhl'SlJ, }پ0_5-g̷0ȈKs--fe+"2(Y7\q&:k0þpG5 -wF'ɬ(QmrHU@BlH&r)> xz0KլC6j87PRYَN7–& gБՀ*fV3yНPf D&IY 賅Y%{QYƬ>F ,) ` \Iu  )4{X&lW> yZ?aa2׎6E)O(Wŗ)~h@i+j[E؂`} 2C!]^k]׼HǣFM-T= <e66h)2\}ەzT 6ߥլԻP뫇r;Ūn_6}smےkVё^w~C~M}589Ts#*9Lى 椕|0LI&P2\6ՠ%N{NA Qa.u_h;LōスxT,NsFTnQA&hEgőʑڑ̑k@0f tvV2ΐGX3[V%+d́pQ#y^Zy$C=2г\& pP`CeGnX=rBk%BiLk꼮)w7*=:r\@""e "w| ry,䑦iv-s[ &ZEC"#QeR\NV~ _3nΠ`1y߹)ކ{?c_5Ma!xR؋,هQ,_.Y_cb%% >%ԐaoSE#5UJN 鴐N 鴐N 鴐N 鴐[ {##pnu]maEv]}(v]maEv]maEv]maEضȧSaЄ=6ES{ J qjZJhH9pVTZM.贙GqLe9`շA@=חŸ̽ބ^QHEޖ*j~^:j ?^=9T\_BUL]"qGPgo$׫ ~vңW?7UuՐ*趝`rU*m${ uw3!-.c"BrNR)?O@*+XJ$3W &t K}XKii?d`*A(zI0 
rA3(úVBp.o_Ie6)툠6 31}5vH. 3AL!b^p[L Sbyœ~*m'BfIDTae~vg.eOU\Ĉ&s!R@q>D}a?$<mnF.Xrf}En %P@EF\" [ ^QQ(T­OR[;X*7zb)U: gn댳f85c2(PR mWh%x+ p8MM+VX]a!uZܡWt3id_z[#l elEnQ a;4Gc}Q, [!e0h%N+m%'2ixƢZŝ炒b zAyoEBb:Q"u A!wVzo;[#gϨi:ѡf|t@VdYB $Qi"mR!A5h{hCveVw%_?m_bwgśYJ nQ7췘˗MaAOǗ0iޜj'iFcw DhJTui *Q͗V]m(*?GmoΊŴ,NՋ{ \A~WC5;COټ} n/!Ye@TkH3Րd$&ӑuBɳ=)%S2L!AyuqOM!Mo ;G{a`Wuꎫ6j:44;|aZU}L '5x{7oL*+Rld{jڳ oe}Id[א,G.5ν3(nU5غq:=_9~lb[ nݿ]{u;ϫ{r7?dSNs˻MgއqAޯzY OzXW|e}+nvksZFҰw%ڴtgiUaµ|MƠiӍ\؍?8'ZZ b:?I(J|S7}@4Qdi+wvͳ6{XTO0NiE`$Dċhmdpᜅӆh1(h(8q{c89"dnl>jKS$Mfgj5=j]ydLI?!qhC֪}b(:jS'm7roMsՄp* 8j ArSpP)jcq4(M4qL&d'ӶM5r* .sNXIX%*cJ2XπdpZC 3RfqQ2F:ay aA}wͧ|&yzBO_QTQiCJ%#,NY), c28)0c8Y)tZS ^6p#axz䆫AќjT'BѵﯿR ptle sdk497g#9Ooْ9J%)DJLI5%Nw/g_PyNG" RJ PHZw,m^ $ 3=9>zQht&Y*cv0sgkp-#) %EP$'5L$36,I=s:j]L= ^gP Y4XOq"*? M LbVضl i1)Cǐ` ߥ'x΍0LF./aR9PC|yH<$*ٓl^5 d AE9a]h9&O lm'pBPNXs6\z6'~8_<\t^J\fC!ຶPO bpf_.,jtg4N".腽Z>-ƣ KB`|R5}VxEbXſ'sPۂ z[Qf*qh8ul . ԎGUȩ_o?D!9oz?x}}Ooq9߯߼|/S9s ?> ?~hM[͍ghS/&|qu0ϷEmp*KA qhcmsMo ,cH/ivP,6ق_u%&E g G7 eB(߿LUb^,(R~sEs')O;vۓLIqr!Kuw4桔y&\$xtʍcSNjm{NX :Mz#r凗A<=rbA`A! yY] p~X+.`RbKlF7N&x𮘼^MotԺ-DͤW>#DUy1gLAQ~YPJUM_\C:"J=+j@P'QKJV5g;8us2NsR&\\[%)lK&$q3v%s;'e*ͯnx.m"LOa7{KFX*I*9c8UU(~rxZ3]֎oZPhvojM #Za4jFEXP !$*~dZNR+; VڠSw"nd߸ȘrJP,gAH#M!׬ S?Fţ#2-/H'Bvx c^""1\ K2Ҝ ET,WkX6ZYQ)G#He6iMP )ԋƵ8ǻY;;"; ux;mT۽UQ%[W-_J컋 lyB`T9X^QNǣAjke4x_&j)` s,Nv5YZ(lst8vې3?,fU-EVYst²NWiyUVF +P7@χc,h`)x1 B2*Z驎7=Թs .+/뼪|lC1?חPx::W7\s|ӫߪwj0(fڷ r]+A|ȦE 0ƲAq@&x=h7j_Tڲ ﳕv W'.`|wh0W2ӱjCEepIB!xeyiUPH$t۬k'6Iv(nI]f|t}~hb~M>=!yȉ@H#%:V-ʀF I)fz& 96Hn2=y$D !EMM4Dv̺,9̢ZwnG0↋y(]:ڶ2j{l ?"G&d+2&;n4i <1 A*,(DaV@21CEG aM " uTRT'}e<֝sK| b5q@gY37,'EPX FZL>]joM' h@3CtҜT 9I bFnJ8| 0<$.>. 
VzaJkE:ȽJk,x##rF?7D?>UяdoogO1.>w¿Go!,U]i "U}:Mu;&i;խW97VAOFGݙ?th.щ7'7(Q/ޕHvYqNrcTr46F][͠nOMUFo_u{_ukh8 *Ocg[tvvQoFxh)tUT/Wף$|]=.ƸHJ_wF_櫿X94&l&۔F`4 䶹++>fj>fzܰ%TIr&,TD0g-S0rQCj saA~Ab둕Aq5RPhu{pن(B!a2hmKsr~ZըV\L?iQ^\آ.&J@GZ0"(Mk@k\{]bLEHUᲚM )&+E qr4:f xBoTrCƨH0f[[ W3N>[y0.22cuyɂzKN T+z>y787=dxY]g<;ܗ zޖ5rjtMU ; H(@I)%Hޑ)D r/Cra=_d}DMQ^-(;dpH1tqῧ N{eX^y)[.;Y7[@6Ti?km.=T`x)+ ۔t&HMO)}F¤ٮ;zm(%,I69"2` Lں2cg%W6U㆘`Rfȱ$+;4F E V^7V(E}7uMbDVlj K>4el@8ol0U*2ݰDEϪqO՗_ t)9nPiP  ((4d4 4ϲn2$BIKB!o8 Bg.DE#H ugϢgcCn$NG~Trvosk{"T^\<!':Iɘ:L!GsFR @;ȨLPrʌ n`7pͺ&m)ֳCw}-+q\L)}ao|']v8OX~_UD޵tcpiJe-:DZ龚%kՑ,HJZd|OaS,?gPg&SV*L2ྤ>ZJj3*Y4A%3NXY.AJ%O٤ # : B&銀A򡩋0`pњ@dr'ECx)ļuf"KGhVWy[= /hS%&>7;.ۑ6YZ|Ѧ'(ю+Ԝ\RQ4wc"1ļ3O1h8@N!Y n=|Xy'sUCv6g*(o6<ۋO?EjwSpJy ?£*+yjܟH> -?ɸ_\ߑ04^~Hџi4^tEbvvF[_} ]v>=NH}`.;Do-`v m\}][]>I\%D8t+dKvjaJȖFա}w5`%n uSlg]U=͆q[FP-~]]_-:oV+@\C4?Zz?~|'}~5Y_ ҦH¢5&Jn"'l,&跗Kcͩ?vQ˿\;kZ_G')ug^ɪ_F?-7OR\ҋoOCvsŇv觓OolRwu "Qxaۍ|7[P"a}VZԱg]H-͗ /3^I_~OY#`2VZ]L姑vEY>3fjfg_SZȇ('MbwxqSl?befg}Zx0-܅ޕ GkxթٴmNWp6_hp7|.TX(V75,O4b hfbU7jkm;um66J'j ;Aw yGG'0zB<H6I83R;f&$CSec=$"*Ja23LȔHKA 4䋉)HNh0K@  9lNuڶ_~eycTO2kf^=3i5*8Szخkx)4dWi\B ˬ x17ey0C*ҶmH볞}ʵ%rĸYd@ЧC,JQ;Lrmuʵ0P.O&󃫝ntMLK?ӲL RCj 1lQLtML7YF8PAqc4S0ZT>%Z(s͏>}:Mj*O`a2=hbB*aOJIj9 Y0ÊXLyNOn{gQ1uFnw8>|o0}Da@cyJ'gZx8_І1d[!S(@B?=-w]ɺC{gARƀΰz R^-Q `dEjn.d lO9=Jmm.!(lFHOB#=Kpo8 0"g< 7>[=(eH%ŭ\UDne1Ei+jT2fC? (v[gwn"=cw݋N|g}]m& G҉TpM&Zc-Jubd"Df@eJ-@n}2l.pA9gdLHUFd '}Ca'b`ցpv1hsPe=Xh}bx&[,&Wͫ l Uhw~:W?i~>~S5WټzZq&5՟UY4cī%j$b\`)EI&Ō1/U+.F.7K%׷iu}Ұ^iX]oڝW/։EVm4j=bXƩCv=J5^uZF d\H?b:7 ^fNv)]Y.A C,1kRwNul(lE[s"ܜԷ߶5#,9ZĨ >]-ɝde^:?{hkRo(2 +#ąuMK5) ''5-a?'=̠;kfdvvw{y-UjagS=_.WY.KJ<`,$e2W> PL2o ʞ.&^3!Jm#IBaB` o Rj4 I'Kg09w뿿u_S:H '9Sʈ8gLX2Udd 33Pzɯq(}QT.ӶXȼ"GV@ɬ9J| L>ۧt[踨\Q7 2vD2ud" RM u1 HL9HC,e cG$c3 ;*_}Uk/]hW9ΫJ*g#5/Vk YQ8pjhC@Q+R@RfQxN(x/#ù%鈶7r9̷lHZԈv%-JH  vZV:Nʆ+ݖU$nU/kjOkuj/3ltOUw𸘵0.4{vZ=G\i0y,"5!".c|} jiꉝL/4yo!g!1 ڙhyi יiq'l ]oWeQt8۱JJfub1B&~$ً)hI&VBjQRǤ(*8hznHY)GT&7AlYP{#簠ERS1qo&>6r}#);޵׷ 伵 *yE;R[Y_6G-pp,Pjp^! 
Qֽ,Kd3 c!dtjFc!C`V{lʒX.h5H<`2ܐDr|&%շf썜Ú?қ.3 g]h]-͆拂w.wM\o_>z H29eVF@ qA6g2<j$ &H+e(ƞOHIHQԦ4 F%.A&ێYrLYw9tǢ{jm۳ֶ>7lU;E#F4ga6FYNFiB 71#x`b &2."I}1:߫Ft3f@mZI.( )DРdQɒ^uToȃs.$׆Qtu cW/o0GGYbiX:8T?0GDc s'-9*kcQBJ hj2!K2Ner\ax ޺Õ!yΰBXB]R"r8`y"z;"XbW;mw~dϫeڞW#tJA,Y3 +x]ٜgOv ̄ڂj'6a ںL!rEE?_̄irYH:EQO5 T66B_ZwN@ L^y0{"kB~O 2ZH2[1ZQ_(P6Y&j}Z Nt%Q=L-7wvgmwz9'ڇ>!%w;X;ݟ!r/i͔D-`Q_bڒMFbټk}rwݽxZIs`=]QؿtgMwwv7Ã/0;Ci{Q6|S \~`e%Mt~z=e`֬zխw-7}Xyϧt]i+^aOx$̪,A]zРriTZ28k%MuX#Ȉ:4Fvh8x vbJ]R!25Ø@RQDD_V%!T;1>1f]kc}W09C ƷCPCڛ(sibmJX> I* 4сbfqL9uWZƌ 6s6%:B Sc- KIZ|折? & cfX2L, \p.8!ȫڡ՛p2| iUp(BH1Qn +<5u+%ذ7WvOjy!76Cqs Gr i}ܺ_{rǑ2+mMm҉ׄrPm$pap}8 -_GTcT0 ] >q#alt#<1X6Oc,2^x-7F t>irό最%.C@ ,FEoh1FGwl۷[)&EIh+@E`*[ttoiE@CO@܇cN)fB}Wæut^etc<ryDyuㄶdm0FJ.k08}dK%+$7FhIԵ&ƣS = ]^V@sW7iyw^kkkwy{gb,&y?Ng hHx)]A%[Q\kvX)ϫ}?I5<F0*[QaIO*Z3j}yTCGrGWHHN/x?_|?x񒶑Ͽci#?~7t55ͷZZuuO\ }>y=/YNU!0@oijBOSFp$kLvA0 "QU6#+쥘BAuT:`'}jJԭ19"Y .{;Y9RdM&=@ʂ|8\hBkc Gè|A<^ Н ݑC9u7@ѯ+_:^꫱` %0FXfpRK2JC[k:Pjou9V3hI/%TҦ1) !* BeJI|HD4XH?Rې"xoc+D>Cx9F|SbD\߷z8K&-`Y>]e тFQ4`pHΌ"5gKhd ƣɨC{xMggn!meZηY8=^pgocQJuq{#p>3F&FOu60&4-Zo\T8 yAL}H :5DpCG,sK ƑL9x͗ ՝3.qݦf˅6l o½o:f6#z2u6Joiu']Ac"Zbz:?+ƙ AEE:xCQ,8p5VEmUVE=QՐ(/PIL+{g- ,MJp[ (v8yEl hН ނ*( K/h̝$Ws6J.?tfBVNu?~S4*bSƄ d/$Z { `J bZDxUbvRѧԕ(RN) C(-hNA89ʛ fnԝ\vn%pFm&3Y7L9{=bHT/"VFG"G+N]ERhϑqT.J`^><zNr;*oPEGҖQa ɉ5kրF@!Vh 5_rJHD``^vJZ k*7)C%_f $X)%)`iu%s֜$M. 
TRv@[ҭTڏ,x,G2>FT`883 /I6o!ĈI[G FKߊFJ&y6b&iXl#@ r<#- !0"4rgIKQV2%X<]j7F '<׃/c86#Eqy#("r.fRy/\l>ן`0bYEn?r65@Qnͮk77NQ}#3B<1pUaʛ8ɩR=H#&1H`&tY(6s<6Ҷm1{yZr~[:sFO ТMRL Ҭ.j)k1PXVZaKʇ,kn寃Gho1V ^4[%sԣ=k_n)n;nSm800cnB.>ҢJ]tlX;7Թ\p= `8V`^;,:H†s- 1hZDj<@kHHIqހEC%DhVLZh"MTR#*\RQ qif>֜-rS_>G5H67&Rc 8& !z,:ڴ,n Wz 938s)`r70uT!τȄa" KPִP+r^'zJrS=⋘Ґ(3)i,R`a#Mj3oGqRG{h♤lxgLi@aRzD(F($K1$o$̪Zn˷#mKɄ- LT/|*3:76N;$ ah5F" VFҢUd) 0zh 0"-{c+`x*ܑBv; z3N 3aEP(̬AH$ 56yc$H cC"}4HuH 9}S$9EGXM^sL(&%N Em:2,"#aREb"D46֜͐hU $RA.=*n> rGD(h"B'>67=x<7z/؈_,ƌYm!-SA[^PN'0[1;!'$qJ/Fָb8 Þ1}RCW,v}%΀2/:pU:b]"R/+.PV^3ՓwSig:fO>fTfD`]:{~p3gB nGjP՟o^o/:}\AΈT_wU@aRrpuF'L͓>c̱b2^oY1+nI3V0#*(`FBdjw1*$Z!n_9N[F=06$k/9vH(򅖜4 3|UF g<q&[s{pOLpO3R ?a"y+0no".7~szx.%[+ݛVö=Z)B >a2@DG>r&IZ,5gvstKf+(YFگ%`J=*VEN>0˫I8 vw=:_+xhJa2W-tRirRU X( ǞX,f";P eHt:=~ <M^QJkG;(K-ΦWկ{]\f;4ƒ n&ՆS⥢Ya96S&a0];}w5a@VbBlxkKc'΃]r#Gv_{[<ÙWb_D<0d-Z(1s ]mI%$IdxA+L `s1Qeȵ:[6(k0fs6pqo[/3Dآu4=}n~.!̜u!qHk2^>#oGHOHڤl(EU▕X9P՚#ϱRϰ6;VA OgM}-wc乮nx;"Up\@ ] ȗ!i=ZK-c LLj()B"Q ⓊEEho01[YVªv$SNl :TUf 4g$2fx3! ]Q5%DRP8ZjsAuɑWm S,XˈohlմIX rLa- ic,|ZÀC*eq8-gҎHTVW&7UaIcPycdm!b(EH.9msEΡ]T%cT2@j) ")zgBO[TuLr抩ɔ lڄX٠߼S1@:9.g !./ve" P }+)f 14yjƐ)"fG̨5!vAd I(oL+]"VVr VQ'Ӛ zo]B uF%JE(UjV1(h+ʗho A"R++Qen0ui)YbԖy@6 A %؎qFa9kR-nne/Bڕc;03f̪(Nu)*)_,K !8G#^dߧwvlL|ХYK.d\*MNzDղpئ'!`|(-Wq #/ f@(2YWҥ`ruA"ܺJ78bgP찰ڢR :/QhJPde+ 탕EJôJ燅Uq@)h^\@&6X/x›q+*p=%,:;**@ɉU;kVpۂn2З$ ` vo^pcŝkGumR ("M'PBC[>*WpBl}z /룫0 FYuE_Hє>!,C>jඝ*(S`a @p D>\;$jy=*f@kFwQi< k I \9viEpQL;vM)@t e?CƠ#RFdp0",0 E*C A3ַgZA.O 騚:K4YtJ Ԁ,yhmJUΪr^=+렰^@̤o[! "}vt*2cŐ%V & kai3+ջ)N6TukriIm}u`BMW-d0L&Sns`*@,E2 cԺ@!58EMy(Y5X`4h;v =A[C{o>*5ks8h/Q7C|̨py@%\rԠDKhLT@=B H>Jk$=|P@z'ƭz-2' YQW*ϕ+E3)ķ(N:mz 0^ة !Z[cx)U,Jj͈OZUw3~ aQ`Gϊ=0`ci Fj F5j*\#|r,%&}*@$4. 
Z`fAp^)~xYk$}C(U\@贑aaXviYɌB3^ԀF2^O#0%7aDF=>d8clYXfT 9$bUtMD|٥P#HhQ.BGCdN]hE) Z 4rMҺ5]w*;c70&0@5aKBo~-nNo׶*eO;siZ.@aƵ 9|~A6XSY.aRZvmiqޯnxh={ݾ{{;&&gFaA\Î iL H1>&ۑ NY[Xh DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@zL 'ȪpL0%3عWf%fu@I+0Rr{01f)n"&1>ZputL8Gz@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b}L eB q8&U3 V@qLj DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@|@\- >aA\Ꮖ i|L Hi%1>C&g"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b@"&1 DL b}FLK9lz+jKMۨe}qbA/0TJO~G?eXsS*7?5Aӫ&,GL^]b_Rg44!*0Emfߕ"Mt=y恧?ͮOpwiC\Z"ʳs_Z0A^o|_L0׏!=' d^# {Tɿ-?~\ _PMy3vGw_#(e!}4"q5޽o ԭs}} ;"hh,0pn_( |0>]`rۚOuN$::/X]n1:m+y勡~ȻÕ u;K>!8H/VPsjNjԦLetͩƹ{e#w˥>EbM_amŘCRRLւqO F+4~c0*+fCq>{_ >IGDKyþZ;رwGo;x WX! eo'dzU>ޫǃk;[`}aB}$s|Ե7k̔K+ܽ qͩd|-W¸Obr;敟^LH_6#*F}ewɕ+sT,cNms@=pYy9VV_6,̇F|7,y3F<4Ը~4.-$[{z~i$O4<6u|$~kw,3#{cj16&{XY~Hxn`᯶[dR'g;FlѿT~ѿ}K`)Le{P{@QxWI)eYeȲo{be/sjG9w2ys4u<7Q~~fcĈ$R^̐9˲bV#0\[/|N4;iYmbx4nh17Uvwg-vH;ޝv$YjUrm:Jj*([Kܟ=~u;o$r{,p4w7wCǫOsMmz0{p v|\K_",ח\/$7ԷH)baNDm_}o9gU/98eoi׳NǫdY%@VQ*oK!zƌƁZbĜŴbj$_6!&0 /DhyBiDa)1b+Sܷ9W>zVl?E#Y8֧$Ʃq&]J&baI!BV'>Hahp nbij(aHf'^lwM xZ<տ[>Lx-|p/k̍t?o|U站+ƹW@=LdR<:a(\tN:!U<[נ) FCD L|h+YBu߶V"i/j?3Oshk ,_w=g([hO]6R1GrC ݓmF ^j,D!h,=cs8LIe#$(ĸDw3 aR*0,`# p `奦0|<) NgFنAN GZ I05j9qsG~h1(q̚ZS ͢2xIli`8G]TlP))OkJ73;5f k;&4O.fzʹ&$eO4:ÓYvƛrQKfxpeZ-?|ۇ7-w0UGuSGHuW& ug׷qt'e@x2W2`kUsP5KF-rGU5U5kJ/Et{}.Іq])-FδP$S?/,m1I?t~uMdzT;N:=->̺=黿YbتX {I|gx~Os=G[˯0Xy_ξ3L-~+N=i3ɧO75\Ԭq CNue38|eL瑓4EY.352㫟w0w18D:&͑je?is|1`ՁPQzUԶqkЬ?rL]e ၇TQ`VU74LO{63hwSުDŽ7S1Q㲋D{qÅfeOn " 6٣& :P%,(P0`̛h1ThjP8hzt8'__Re _9ҡZ$iC)fp,FR:b$ '}5zY^@/*!M6Ӷyuڞ,#.Jnb/9&r6"6bEw9l1V"V?"D4vxevHjN  Q-m@Ah_[BG Fivssۃsw {n5{xh i/ Ptjlaρ5gjސN%tUMw e?|iJ*O{r{yVqu^8~yrȱZʄ2a^!UoLjnb 3ꄷszKK#t2lO wm)̑(̏`ڛ XykR4laR1JL:K۟aN[fG;03Siumg|;*73v"]YYƟuqp;Zl @PF39}\ABA[t.`aR)rDc-|`ooп$H9LH`9% w徿 6tz4Uw&.!߇2Y>riW?}7~Chf wo?^o9s9Zmq&%d+U0Ok4.z 1fT/8`$HSk!U[xQfDcs#,іȈ Fxa$ \3i#.D\.,+BM "6fjX[f~QEmeBZcJswSL`к} 8 !-Id(,cZ/")Ymg ?[y3S  GΨP R!<1H愥 \Z*ۊc8 n' mL5-ulmZp0n.}G( rILqb`2 )HʍCTe`NcXhm QFGR*:d2kpl8`0 rqկ=nY)JFWe%ňhu_M~(30Ga ڡ"bmRrZU 
ENkc[x Q_Pz*qϱgLVH(?FJe$] oG+% ƞI6a :Џj0E2Ҳ~3HF6S-jf]SUUuuD& X4rqYS\P21CrQJ ༷w"!1Z(O;+ҞqggtPC;c#ǣC(lȲ>‚ E BcO-(񱢬)ޅaa4>-jPZreqoQAjr\MB@ 1~8/N&i W DhJ4ui Q z9p'QzhWUpZ̧e~ƼŬ( \A [j~Ż"P9$y> 7mbdnWMߐ~tMQ I&-OG-&;zrqAb?yoif[Nhf Q}[gۑ>XqN'Qe7A&U3|ʻI6X:xM[hpC&k{r|R}CXrjlzINƬw|ZYllΝk /PCIy%_;rof{-g>⫻x[eBd^}N\Ƚ6E]bTM"Щ`73 OB!ŵ7ĥ31hLAGL>yo ,$,^GH%bhg@ 8K! =(JdrayaA{7Wf~AhEh3J0:*)L;e "8! O(f ˻<%e&}))TR Ԣ?w<\뀻c3ԥps1ysnNr#w*J%)DJLI5%Nqw=rvpyB,Sl.Ra G4:q mz0>vrE7EQN"8$,lqN{Z{Be(<WhBHå;_uPU ,q rv І‰dMFȹ (.h6' +o0&ZXaꔱJB6QfR&Y>NG" RJ PHZoo  &Є$ TL@9ϱˈBD3R#eAVvswyƼF0 2PBP@rQTH2nPT36|Iu uןM@1 ׅzaTظflc۲'Nm &:}?ޅ, $NsI(M0S">mBK::~*g=jNhrzc;׊sCU\u{o8˘.r *2Nof3^ټNY" i?F.B,\1Cu|pf_-==99^oB- BSeB/ mp%&g7PO[c?]4gYy]x9^/Zg+ɲ }Ӡ~-7s AV4 ^_]br0|!YS[ٓ{\ [ߍ]?efya8uT>2?\2ۉn_ ڵ2p nV9ZKdFUR=ǴGQzQN0nǗ]j7]7m~^2cۯ8GeR_a}:J=0TK3]=w}y?>{3\F^ϫ_Zxe !'w^GO&O#@?{]OwiW]cm܈m]z>~/Z7.cB b݀@g{1ģ$'׈$ 9T$eW qp)^ 0IKH+2Q( Ȍu+MJNPyi˵)I"i%Y4Ad4]e!y[NK#Rh8$ QT4|She htEM}"Mo"[ˢN]uQq@ yQ(P]],DrpUUn8ry񢠄K7 qTTqA!ծo-Z.)2&*=,!YFHS$5@ďQ$z tvw/H'B4$D@{Ђ-vDɨh:sﱛ;ˀpjexmTUhf ?V(+X\]+T^Q6Nrwuy7 GW5LJ כ`{o4/B# h)dWnC-ES+Ѧ6KR}&H97 عجnP]faUt:P9}\ҥP^Y^ZA2IoMOlNt_׎2K/".l`+DVԤ#@B>_I)(*U:Ե{|]+aQ{T\Y j )cI*HB ZhD/9DR0.LQ KAe;U Y)cD#Mh rb  jglT8dƛ ,rnp3>mTWr_w_zV"9NFSJLSXީTAs`,pK:ȃT3*SB U1YϢ$$ !*P".Z3vFfVLvBݱ.T]V,#$57X]Wa:O~@YO/=8>h2RPsѠ$:)#+F#x= O!{."LB%YV\頄W"jKmGMLXDX*H$hӱۏqw쪵MZ>zgs?Y 1i RdVDoCQpI#2p#y$vIZȈ @+:pШk3P1`&\w) "Ds٬k>Mb4A#)4bRxQYPfܩ=w1R!x]F`< v[s A NmRKY@Q(MhI3/NS]錜-G]jz*µWήdG(IzA/>񠍕pQ: dRoF#lBj GAЃ^|/3}8oTXn E/npY[m=Y#& ;D?v~=ʵZJ6M7rG/8a}7LBե91ºXM]T%*Ѱ^>iIP#XDhP{ҜK#)i^?b`2!EgOTM|Bqu"͒gQ] m7#IïBj|0[5ͮB-Bax]ddlvÖblȈ?] 
jTe3Em33<継s[ʵ -4}0E!,*hyg' A~ !쌏] ZL;Ѻ!%bLḙL|PXc5MAM8ˈ޿}QBHrzr*ɸOߟnʊv=6xeЄrrV)T,}c̨;%Ha(;_roc0cԾgm;su5Q]\2:GW]IWsث),K/3^[~rKwCgE/:[Xx qcdʣ#I^J,/y3nOQ t"+t&y}蒛,.>&G7yO_qgp8{sXS50։ΒNS9x?:g,?&ٵ g>Mh^/Y 8L3#y@rma~BL8O6݅OO3S>0[r3CYc七OcKʄJZ[Uer4NdNg I"666^>9KCm]z k'oW|W%px\ۻ&9Փ,BD]zF}9&L IXmJYm v'~GS624-j^f"UF;wsUj5.c켨.׿ӎS-h~kHovV[!hCeY48E>nT&5_vNkѵAǨͦ\N.!|wkKJ;߭RxSU &K3mDۤIM#$\KSmLb2cRN›1-.ot.GSD pzGma-&y/:dY=ZǬJ; Fix9w "L0rih*S=^bx6;[_:GBFoa.FĠ}^7,HUtܺ:fxyF9Yr6'`97'M4ee:s!$+F[tqvNJ„zs*ÜDi;z7Eע$S&mI!a##+$&E)k#JK|s;Fw$ԳŢKS}h5ST:8S׾"i4"f;xpN+ ]+" =5ڐ]jy;"4#H2](OD=elE>g'`b {'0O4v/uh;۴V( I&jeW+cAme±&6J2qF9WiIi^{X|/b ~NcธO۪npWO+5+0" v\id9vXd*zbJe5͐jPoB+"X`ܠQ @rNc5BȂA mG HMwآDy>UFJ[ AN98f`&AH)b4 Z(CYLu>J@Z|7ud&t/Q*ACmj(PHqtj`Wf֞a3;^vϪ J "F_R]5=#i̼B~իf !ѿ&DOwdJ9MȲZPҰ*4OX=sN །L X3RTY7ICQŘۯ=("_ >ìr6}7ϸ]Z01սu"AAB"iIZBT8t>o:@GV+&{H\!iR|tLECNZj,J pΠmgsm5H*xTHT—Pw@10 1Bk;3SQAc,C)9kD*ZuIBvp 䁀:t kR([|=L6$BjF,l[viXi,%3&hcp%l@M>%La ڪmryXq=]n cZ:]\GIƴjD0uҍJD4IҢJ$v Mw'QEŨն[SQ5(U/ yHY=yh4:16vS~ӣa#·UfĞT2=n(!/[tCۚb斎T .܈ܢzI,WR SA :(XfhTY")]XoެGŰh슥Md >]yE!8ij)ڥA$\s)֛xyŰ aLO(ƈ0R܌EEH,{awushp?{$mDVc-S{KS@+; fnfZHk֬Ug(j̤yɜfj)څ'k=DT赫}N1j-*awU\@  g` ae+J F̀| =ʤ蹐fhJnÌ 'k|Tp(g=zMW: Fe(6tqHp h .* 6fTS.*w3. 
L/ caVCMUp*tBN Zf LBFmT'3tECPޙGD'gAA(\E ;'kqڞRyP:lrKr/}>h*>ELm, Lki'H=}糠ÆTE]"*3$5ٓ@ &>CbI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &b%\=f[v"sΐ@dQϝ"+cs$bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &b3&TpEwRagH X@d3>K&bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &bI &|I C2w")w2fgH XgJgI gL1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 $@L1 @G\Z9qy}[_\\P VMs#!%r $O;O5|x/ry\k[ٸr/{t]T99>?yӾZηziX ;V_߶jX#7 Fd՞#rX·`kA ^n$K%#WtBS|w[zjUr,~ȞG_p|o;ԯMrh8tEyܷ(!6i6 dJ:jyTM QߎpثƗW]H۪1rVc |3p3\[9<戬G|bσI (@5ejd!yd?NMȫA[YrFlj0Z!zkB%i 6a)p=+T޾YeЬ5uɈ(Zak);=׳(k3.)&Qm U3m\3k^WJLCxe7r⭷9K~899s*k\I{FPM1J\}=gX/U1׾7k%r!;-\Hr?yzA#ME|Ƽ]Hf']6jVn;Woe݅cvE=M߭A} KǺޒl86쇆RE0<<$C{t#QPZoLg֍9yf]6ٸWxdw$0iiІoh`%}dԱr~WdFFMZϺب1AƝQ 3cy r{SZm2Fqi&l-ޕw3zgN}c\O޴j h >wt'ŗ8{Kw.{ 0-\GG:GB4jn7:Vk>Q %pMpN'v dGuE]18a!ԗ|8g_3ϗV|:;#ĸ7]i77y!Z?n2y{u/'o֭tfMo0K+knI)z赼;}D_;P("MRIЪ"6eeef屿BlmeWz&K)!&+:&C?*c'&`€wyHyETB1<<-`0,_BXi-xi89Pڎk O9?npg'N!Dlvd4=3;QΌ]4JvLMqXN DƤp+W7(e S'<ɒgk8xi)L<_,qpA<-用C\U]0alb1,Y>B$D|o.擼3 ^hhJ)Fpj7_az]]{yL/]ȫM=@`ߛ&"2RaI[F]vw*KY0GD~JS{_$M{ ֳwl>}FCD3A2 %HhaRm}w-ٸyf庼bD$i:FRӱlN`푍#TbSɺZ54ѓeUJyJՌY6mVnu`c?'60JoP ޴Ѡ|U yK<&j_YR ʨ_g^ݫwqLCtVbSe {H1T໫[0ʔ1@->GҤ 6\a JvKxѡ%2}34:;7HKSt{h\ #Hȼ :FTĦp}6ieRiGaT F)קZa R2U*T ݢX a@.UHT,4<,2]@R(jwOw Ys +Xhpվ>w)#0Rf^ (QT9# }|N}FZ~RM^əs'* p9s32]5kaSycEh A5a)FOʢyPR.<^[|HWsʡTӋCN3Ay-4U蒬&#\U/cW^b`}486$U'vlz%T!SF!a4grA)l >=/2;E2['ne쫟5x-+V\{= ВsnE!0JC6ޭӐʬa/Aɘ^SfDAQVwwiZƔ]NFp<̪.&X "Q\ j!.*)3kg6k)j`3IPUAjaνW؊ڤ'b5|LϤx%GzF7?$?lC&xKډ-Uk쭅hˀCsXYK)q(BR0I0"IFDWRt+Ĝ ๼3f[bf&P˭/d)ZS{70V> !"E4d'\!1- 9Sh,i\^TҗLVm9ά?>?+NR;^dr6W2:g_my;DA X0S F2M F|>KJ;s*!~3ˢkQw5n{ZFèYcP`E(0)?b}F|.w8 b_Nyӓdx 7 ^Ngzli:2&q2oÿͧO#.TxjeIp(n6"uN <;:x̗=rfšz|bH6)6ծ_ͣ8ֱ{bE&KÃa-/Ξ]%tZG֒͗znA NR |8)of |_?ؐsFd6XGKR?ގQ}yL~ R#t8T$cdց7H]rJvWP08Jt[ yB?TGYvv!ѐ咬+UX'7'xs&Y6P[{+gV-(tXIuWú&d՞(篴o~CImKehTp4$TD޺-˿g{bE'*(H'r*fäA:'zٛ-T Yt]ůUJ&DD·' wNDi&:2m9Gq1bi(tzƎΖS)]C<=>J5uTs.dh=J oa [\9!wq `B\(0w%0 ~ǀ_|1}Q q!`϶O;2,«a2O7bTntdQ 1gtiopۻdޅꛭNJ @FN :ζ )渏tGS7{k_z֕51xy4f{v\:*u 1Q$r΅8!uǑ>cE#94*@y&tq"XBoΡd:vfk^VڪrcpՆV+ .т>y)U+ kOoҢ Z(~ 
xTzAVN$)r{9TsBy!TKຩS[ `֕{V%q,bs'.zR*p%ρ:-pR^%Q~A: }p^ J1Bׇ26]s J% qt}x*@H)9RkCG1`]:hdhJ Bpt1Ix_X6X,%WD *p!EaL;ÏVf\zk${Z1,Z2W wf\oŷ3G{*BJĀrDH(/ \an\Kd6y24_C悖Vw;բ0V\#x RF"Y,"$\Nm~բ[B18$-ahs}LÂ|:Gʂ,kIN}j:γO2;!XFsYˏvw!sJ3LcDpd(J53QE kUbZ KQ(1ז{=ÙYB snId_iQjr]VCﵪh6 PCԏD̽?2z0fED_qz2ڼv[LQ Q',ks?sWhWv3 1RECf`.+E11y}u*uEIJ:'z8x|`4\eqVxWgwKC(c|7nrʮw5rN]%ۘep b.cyOѮ_-"wK8r8aEle hN 竟k xrvqKj+7/;'. b>'Z]q8cy #3l?SZDY9J1JybD9GHl&ZjR7T@wƷ"Bh7йH+۬ Mi wR]J®90JE EresۘÍ ~%X¨HeTdUa̮FGImhrclgL'hV. sësqMP>YJ%SFÔn=3WWtɬ񩬱q! N=ua٠p0t*@2=V;˅^OE>o2D"q%(把B2#eq^vcm.I8N# ?Dm3ieb7(9OeѓKtId3&x3*s34B9ϭ6QD4tn3ClӆgrCU=r8ٓ*ܰwvɻȋISnc`Jrm<5NѫP۪%00 ͮqj4{fA'vq̽PP^Re>`*1q]ɧ6i% cwX]|.6,RTuX*?D]Mg:`p;_R4Ck Ə "זlngojdB( sܛͰNEjzS;Jg$Xcڐ2Lع9MH=^7GU V9vf&#Z0_* aG0"By(Կ)7r#7F& * W՜,}qĔ$ UNmeK_[Ͻt j4ms^w/n([)S 3ʵ/B<',Q7m+su'EX)pfuקYu7*j&e8-ߌO;HX\KhuBKݑKG" mc֮_\$z8%דUb8*'t4m"'aG!]:ErME43z ߆diwiT4`NqG&vaII~Q\hFf2a +4=d9X{iA>i>jTKMӘ~(T"3e@͇a.EHj0].AdZUcC,n`uYR\8:?ZȶkctHQ}U0z.>KI5e?kJJ>WyEz<-bů"cF7I4giZFHYpp!Lѩ,Hվᡒtxȁ X h [%(z;]=,fTMs`57?F/VX ҋ3V'Jݖ/ ʥc$J*oaXR U^ƅ^^ucxX {df4*PITډWa\rۊmlGpMin{90:b:sSPgNd=j6r KF},YmPm9.g Bb0攚~s"Rq˥1{_PE@<,|`̳$]7EaO 팓|k:.*rH aRO-,+޵Ugi4U2ze~C 1kOuY9"Hcr:q>V/Ӣ$t FJ%ֿ6A#mt&leTpdgdoJM?Dʼn2N'{x}[]>3 ^yg%񩜆LeXlyHF0'b;ff]2r)YŬ6.ocr:g'w+#T0v$8P0IÇ1R#fͦ{PhgRJ|vi"l%p-E^B5͙u @U`(FD4u $,yJQdqb\s,ypة)dž5ZEV"T V= ́ږlIgTI 1!%b@yN"AeQyn[?aL-<dɞԦRsN9l^s#s#ljV2ϥռ q ;j:'-چZ0 ҡ e8)|_ld7Ys{ҢNiܓv\G2TYL2o]n{ øiMcx'Եcw|n_8E^_C5 |}3j Rn/>`O`zd#4 礀 RA&3]&5ڑw)%$$ b{wNGHWs;$Ǝ5,Ʃ.onr6SAT 1hAa -A0*"Fy=5r,/Ԥ($UCE@TЎYKb/Q7kՂ +)uG4bOȨkYmײ;ukE5)G68d_;1sp}#խtn祎v:[mr0,rv?q7 >3դZ仪Nո'/$RVl nAPY•cbK F{n9iD fSo_kRQ̂4RIwiF[OveSA~Tp_!8:N>sbLIZc9?M%M_ķNCXA>;M\Bgjj/P8|<ɞJBY@a]VΈiQA fKrvʴLG42IҸ47s +7UT*4'ЬVNFs3 *$JW3~e 9+Wb.8NS3{7SVaނbTM}+dK MBγeK7 tw.$um; ݅pXD=45F(\4ϡ% \\J%i4 :w|Km1Jk6]}M(dg 8̪Pc{s*H톜$ svZޣMrl<d]D⧫g Oso9~;yzLWߥ XJ%&d桹=U~2̳F3k{a>ܸDO'g< Ow Y3xSCu=v!9E(E1#&2@)jzVMY{bEw)4A;KNfZ}Bg JB :΢Cpq<2i/.Ąkcx%Vf(vY8 =&H;ѮRF *PVؗ94XAGjֵ/w Bbh 7BВ.`oO1JpXj1ZfQ'\Ƚ|6uS+$?.Vrp')t+sPV!.[IJrnN6w3ÓLDR cJp4۵vt ^ьm9oco\HfcN9Ah%4 
}4a$˦^fU7erճW;It } u$- S &+"`2{ #*vQ!l SR{aW0{G F6c@jw u;=CR`.mDZ"Eѻ裕Nf(n }BH+J3oC| s6_R2 %$Vp 6^ h.(fTbk!ZRkSTçzHĺ[MƮΣH @'x aEʓ2w%UpYfӺ|ptw"lbwz#iW zo1auU9E"{WC/q`ɧiAuf#9 ?7:DpY0LQjRIKA*v{[Ueg`_'Ŭ.uP#H(c\b']i|`2޷'805u=:/*(KcysIԴg]pPEJt`v6V7Ɠd0Nϗ _]\fyh0@;)1>8G@nz0~%sz{7:sևro긯WӇx=#ͦeQ*- ˑW>ԣ+%RzVS1 WIFYů7xXER@RPVQ_2o(vB\%ck]\A4D,nX̣D昩cs{Ժ=uy!Br0\'pA ps?<7o7teQO ˜r?!9~=`x^ÙP,d%3!9ŎKZt+QJl @F`" VR>+gC l&+W[a0 l3Xx1G BR:r8fWzrNQ*0/teB+{d)v6VU%0ñ}wg,U.:Zȍt/|TcxְX+$c/ņ򟧧i)aN)E |&%7i(Y6Zp˻ ylBz;% /u;N5"7=Wo ,&  ޽7,[W칳vQa<_vk"D+k)w {NU[wǕZS;>u#?!Lzg7 ҅r3u+9>f)]crx{g\EbpctbFt7%lAc;a)-d&0ƒDm#؄ Q2byZ=#ˑ "+/~_@: u&@ kuJ;[5Ed~z띕&e'Dsa_.3I%9,Vtv5쇳ӂo2O~kc{XwQŬtIbZfႪQYQuogN <<^3S-$ {QHKbS x kP$=4Y1Rr֛|6qIggej3NjKy 32ad$WB K,*\P뮭41|!^E4a8DžĖ4Rsa+lFq8{[pV4d1$`˗{/cQ[ -Q~zgϳL–>OBQ,Wήyvѳr؛qo;)/)pܢ?? 0W((-:ߢm=Kgxq?}%Q-MA|Lȓ@0=\'c|IL4c 0܀Ioxe!I0K:{&bdUԺ+SR0L4-M]gaO!j\`7 %¸|]UJY\x;.u"m FVa#tNmJ0E&SW9,M/7^8cUD-#d5[TT{%; QcK )/*G TJ#a,.!led'=f?Z AZ(ԽkoHj_V%=,6w[Gi̤Z#")s*i8ėԖ- |3Q`ǽ:At,)V<5_%ƌzhJǭPJ;>&T˚ bcFmaUyaU)BP.%5# R O,xlKFX6puvB[8 REk[YVSnXd@d2EvCb TƒVe.9BHčhi*( #oE̼q GN zՍ}逜LwAVVޕl"S2k_=$Ȍ$]K˒H"iSVS$E5EX^Hm̃tק+!. 8)4Ne+ƺE-HR6-.=(~,@S=bf鬂 + 1aD ~6 )KOp* G>h8xn wufwtZ3r+{WF1z>oA BMR9xCZ hbOzqTykndXxqg%Ҟ~ʵ (zK!Bұ+'HK X"pބnb_KAJْյ/UĴ|+FfaZDpe0$g.mc'I&%[HW7Jd Rb`r*E1 7eI"Ƣ{C%N"'ݲ?D,!ޏ.Ǣ-vvn"qBnq|./w wtWzrQdDk.L!@MC*)EmJT=x^#۞r\a(pS018_cdsxfŵ Mg+#A?HZY `xV9Z 9R!--8S*PIS{-cwS,S 1U/̍V7E3bgaOkakF(x3-fnXPax=uJ2Q͡Y6DܗM)v4Rc+^w7{-l5Loe53DuRmnZR 0wht?^W,lG0~޸ E+-\!{^-i/AfQcnDDk{5녰fi(9m)S0tvl^24 /OU 2i|1\vU@sMҰ,.|8nqc1Bx/ 3EsV /M9KT.=qh{ Ê(>a9ܾV(ѓbt!EnyhW֠Uvx-MR.3'2Xab.gM[Y+P9%KZ 1Tb=N"hNZj䇍6a>>'hkMxo\0BxkH RQMi2xkN]T;gc.-v0"3MS}SHẚ`!dI^gJJ L4rUJ*qvRuD0I:]R!)}U|p. 
FdPrr-cz`Pd (azk `,q6r)2kȰ8aR,wx=9$zgxMdJJDƙ jNvM¬aDY#xBn͋?ns}.ţkx)q;>nheSٲבo?ZbPYyYܜqwzէYyd྾^QL% 1D#S_W_|5oO*-{S f~Uy/!-*~M`1?/jVj=ݷMyymg7_7^9zZ:Kej-\ ᝮۅ)?33#Dk}dȸ@ǫi:br;^w*k2Mw):msp#&TRyHSOf)(Y: :nސC0!5DtYwgjcPt˞ 0OYIʹOK3L7Hْq9յ 9b?ď Fz%T"FU4{j>Hc0m4SgަQq3)rX۳Ⱦ ΏJ\EAED]BZ;G%2^.ˑfY7]պ}b᤻}+C$dͼc$-T|YTݶ $"tmG>^wiϋݡ!k;RsʐR:gb7p^lBԲ;9,Rĕ^>$ՃM;aS:)T. )dX3@|#POGV9\N˨1G,~ !g)cG LOn*r5I2Hb%Ll׮Yܟ7hΝ魎9ozt"T x8(]Z] ~tͦ?\I>˚&m_JXJ2P_'W!߷7714I8Xdo/&4FߑH `C9>ΘԸ!($" ĭ{>ow&Ӡ'#휌T8si'h%#bQ _"BNzuspJ$qH$0F hdIQb?؝4y˚qSblm']7Z0xo˲w/i30? Ȯ7|6b?]#0uӤ֯ ,b%1n3r7'9#VÌbxgߪD>$L9ّM̅d+#מBIs!__B̞~癐)BLleLmՐCgyve9)1-&~ !F6~,$ئ3}KRA8f2]Wߘ´T"#1.EX<]^>WD3>>{FCt-Wo Ai ?Sҫ@\\PAk5R ,)yft[t`,AR .縏<Rz񏟣Rk8In~Gx^פ*X_Pvɽf%@kx*=b)@$\4PS+ybI YV pQOK>ML"8gH R}J3o_eim Q~GUjRKQLy7cpWh}8D+RKo9b-b=bBsFejvҦY9)C删.}e4hBkk3wEGᑜڻ I@%2κ z:QR&O25dh9*)T%/"aRrcuIv}#,1op@MkOOUp$RLӎIӭ}ذA1Mu[}YaNabඟ\iϮKOupk$V Kٸ5^ܴUe&p40!(iFa07ܴb>b8_+0Uaf8G|<Υ6 HUwr2cg_vsN\L~/7%Y{/)RQ_(o3w7JOx33DB8F8NgӷNo~vtZZZbuthŗԊ%6lt~8ߎtkthҸ*Qt4?x6uGeӮ1в *;uk0.{C&Q>C2NƵ7F|ã-Y1Ƌor;cG 1 yjX?gd u I56 =Hn(FdI; '_ ,o}vyvv>ǽb=S_egg ېp 19Sp'DG}R*(o貕 wuy;>9x/DLӪǃնqa7y\=t䊮{_xC&|nA' t E[8=yhyYUy2wm]*f^ARVR8ʳ@ ! Y`+ZnZRQ:S&uSQ-懣a =ġLgZ0K}Ij"" 9B:\ ͊:`={.&U#SI-nɴ®:CߑI1~Y"EP&;O\bѕ *tcrucXZAΝG70#e?+9:9TtQ!"q \odQ*DYȏVMuH:h&tgVG\{ge/i}B/4I-c&c4B$J G-!$fոu%)A_T kW 99Z $,դcwm(CQ_ z# JLY{202 |b0"5 ~KqPW_a ߠ ArDy-毈iQI@T;LB(5]<ɴt]EMvG6Gߢ( 8Ӹ2֑wN~LE>?:n3;Fs BPA Zg.w&C fb1Z"Bp)$?a5RQc{mn ry )(JIcD(Ո֠fkPca kiPz;8R-.V ¥p):\MJUl_{z爑|ܯ3FK_N=CT*i羡pxyU*b峲>BʹX520h20,6?N]o2Õ`ݕPrWSX7?/BMq9JiTyV7{Z} H ̾Swz&Lcrh__DRAlŅTz|y/K5eh1Jd̢G /C]O;u(ۀ)&G1-;ÊͯDj{1mֈ!Z$yձ˯ގWTc7ocY#󇽷{DrG#H !-,!;X(RyĻ#4cçkW>R+B-{Dk$L AU9C?O@#{ ?bABc<~~k9<ݾ[1ݟ]伩(mTq4C`(R`* r"yQ%$H!X8Р@DHQ-W8Gć)͉< fP)6aD_6pa/C}q\"Џ!x ,)Wu'ȭܫnN+(AO`!Z_K0* Qq&eƸB(R43jpi n!q.Wӓ;j; 9ΤҿHCJ>F%m#ǵUJ_C6&r&ΰsw.P&|;3tn\ -tnouطq2q6PlXv+8m?dtԢ-~=1(h#뒳Xc4;X4$iDjT;ڊJMg'g6 mH2!-&GM1ҭV^dYC͸1?^.^ј_M5 D!pO1,G[c 'CA+*BbhU5<1u 1)PA!' 
KoBcpsߞރ|&KEA}{rRvsFgjS@z@yA_Uo" .E 9'6H) 9H7 !ͱB2ծڮZnWD[-JVB2kۗv4Yl s0PGvGA2-{:$?Ol:;`Z&zC"褐z;'ڌSPiQAm+nr?9fɱ97mMuꧧYs sݶf/ Xh@kk:'b(OsP?lq0ogNCfYtKx xn'O'4v8a=ڿWtƩ%d[?bWpl)Hw,J3$=,z b^+{f `OS` ۙ@ :zy⚎{eXSxaßέsi bHg|d*NWu'֞mm{R%s}gpj O "s W,4+ڽbAc-SIr XwXXui 䠠rn%vG?k#'):fuBho<=^w}EVg`| Nnt0.pw 4~[x‘V?>T7aWh~T ז/ߊ|`J*ӑ> sBz&Ѹn]4L[5r2 UV!1ҡOE6#(`6%FEݵ02-'6bJ O@s+W"$qHI&cDq# gl5*h י#+NΩ\9~ľϜ"/; UZ&b055rih+J&\5tg.PҠT&BXCÆK]9@v^#+ZsLm>r^wEDH8r~Jᢿ_Yb֊w\5 d릚&XM%%+F<@C菵-OLcUQ:oKcmcp5H铒qwDwEٸ$~ ":JiJu$gۇ,@h %NCDI(dƉ&:T&q mN >o`Smd4-MnF7[p@#A# wd156Ha@2 H846О݌W鉛JKup$&V aDvj!b$t,15ك>s܎,wo#n݋W,^L2Û"+E]~~6s ]&Xif @l oW}k0嚅wևM>pvI3V_ֆ;5PWm<&c ,%bS2#W=mq r5OG-tK@@@ngeMGUTR>D EGh`M)hϮ" HeЊGVen&_+!5nܗ[H*P,ZI{L:nKw>_ͫ(Y:Sc/]{R$'옆? ]8#\=[=<5p!Ѝi|X#sg*#(5/鶟sRzmrN'eFpSƝjw>;ɓ,ZrC)Z&#5HecQVr|`JLGt\uJrcS-1k/$v6G NY/OzEoyy?*#3;eF/ |b%ZJdx0/-ѩr8 QХZ_LvO'#Cqrjs 4*D1BDBcc,vyEy}^%d #l`t'\d?ŋx԰fM k2FU),Q jf bTIh?w&uB&Ε%j=5y4)#XzB8q*М!>Oo}X~<YuO)W*`.i=1PL KpZC<2hi. DZxxӡ U_1EO dYJJI'<: vAS_TVI"g,]2'.w̃.`9n}VCܠ*ƻ/NAJwN9! RG{30@Z86 />KvSSP~ONDJdwTPHA-?O]t*X6\i]r톐"qV&Qd 8ƣozlvۘnn!')#lwn)AHWҳVy ק;A/@U]v,rAXYոMQts;ZyXӾV,rRFZPp1ً#ٰd_F2Rde @D8"D /knyKnn#7ww4 w0Y睹!kًA$RQGr@ jP)ìwl4yQ*x0Q| s>|JuF~>j%QDa}f1Z^hOƉJO:䓸SRfiC1Э*kiŪ8WjUY+#+6\eCR,>ٙjaxõ3kr*?W9ݡ2B6BJ& 9}̕5e6,@~7;[gɲǽ㷙yzeJ+-n 6"XĪ'pg~=M@D"XbLjSSk̺h" ňIj;KvC+W܎d4wʾUϥ_-zu`VsW¤o#Oɽhť_S0BSWy^5cmk"Snwzh[ok|Vt;T9%QBTc 붋 mbR"FT%IB58f08ƒQMHd"R$2"*i4)ȭIJ8`G 8JbD1V[A.w9gHI͸tb h)M"9P10RUQ mrA^.t0l8!WJ)EB#ÙL K#01,c…D2 k%vkzjඳ&r_VH`y9֭֕{*[U:R>yym&3ny?`>ܤ}=YH=nR6xw~W}5MVOnS8HbĶwux=1C9c| J8cv:_f OB@g7]t\N~~ϾX1ehJhz[P|1nԠb*Q/`WuJ;XըB~?֮nUbO`Hs'Ek6KdDjd5rkI0EH; 7JP MtTT2c}tXx-wU@#1,H 8]$Pj%PȎ".O&g'z=Ql뙣rC3Wdxڙ<.9FT(c$Xzn>In}9ڲrfan!)6}g2޹ε}ٝq; |~kht &48 H!v+=r辳8iх nQy3CZ[%^fjMܗxb-"Z{Ws0rw#J')Z$'Ɛ 1ھȈ$a[8)^Js:*1؆0MEz|C8sC8KO af%ޭ2qɊmĠÌst;B $%I9p=\Q+A^gK8Ϋ <? 
Bt1_HLN=SPa@DaKTlt'Vw~weF?.smn8w2f3qyp]!3]aΪnKX|:=wzn؋GٱO*%@SB$rD 4Ezwzݬ__V!!"%q,c"DIM<&Nh\7Li!ͥZiô?LG mec$ŷJ*%n&]C+@M}9}Ic]bTl#z>L:$]P"XjhD1) "u1I $Fk#@ĮB*QT6{Sh$JƐjdS!0JH XkJƮ6D5q adR(:Z c$Æu#-M"bRb?*R!厞|iR RƯ5~!w1wk^PHVԑSO)) zcj4 Vl%b&l%Ema~9tlV`V2mDVr^Ľ Qrx!0sD61}ۃ~[ƧHpgmQ(9-}ֶK a8w"e '5hڵ/F'!t> T6DcNf(eN<V͐}} t3atuYW\W;~!?Dս~YӀکLPBCm$an Y [>؇Y1vE!L;LG\VN"y.d1Щr_'޲n_Ns. X?~MTJ=qΦ?Mݴ#=zK[΄,t턼6Ц^Sbɮ5[:IgR\ڿOV_gG6ү,+/jGrnb?\^:0~~*yAc)E k:(ƈ%2FOj4X/GȆ9K3Z.P@2`_wY !Lot|7B'8:.] E>E2k:2o5y[V_8UT5I;fviUg;8:# @+KYţ7G}Dx%W 28.%U Aa! "@cn?23vtG%E/ᖏHrDZq.=6\pV,}[ᶽ*W9,EϘ_lNwаu$B=+X{S~{4f꽼K)z!9M/DE9Py([nM*iusCNTݺuJ4ezv?5P ({Ia]{;rOĬ\,|oszGp{J8^`}/6 XGki'$xY=.[tnQ˨d: I6h)Db{-5hk-Iо+*ג4qhw 28~Mf>,mh F6đ QAqN‚#=ͱV BB‰CPP EA!aNz#դp@(h"x9 Mn0}W9{Ac+F}=1B&1Lj}W&ݚX.i1^Nuřr7!zofޗ b2 4P*@8! NrB2)2cJH")MPYFQNrTF3+hatXAOfN>L2ҫ¨(Y6=*9,^}:/o>9֏o/7:f:n}ƃlH>#H Kͥ, 0QR TDCD2qrib %b #=I;]u2̉ Bc7OsGٿ'y]݄DQ֦~g%Jq0"KXUba̩d"4W8%˳Hx: $A[IֺA˃gG3B`cƌAlfǩT$BS5%J8I$R!&,y6Y:" pr }aB^ήhBWvYOmQmgojT[5>Uzo4u]0 v}븏V^a>MP8Ͳǁvg| WGy.pidl*ҡ):RiE_v'@ߝVN[-pg{[D'H}-CMi BpBcctGƼʼnH#L $DƚC ;ZYC8ke=TԐXf EJ{MJL~L$Il $(h^H2[KZ$T1xKj0_n)/s g'p2מ:~0;͗AVOm./sco+U7r\oȼءBxeyv Tsvj sA1MSƀ.p=m{-mc$_:rZMo 4b*wGf*O4!n\=HZtۋ.77V~>"[ҪF6,B}$5HZ3;[UzAU.@C)׺z/?t ]lH%`i(c~6}jq5:-ߒЬ0_zfƹ!Gߏ_%Bdu7_a ̕NCY\}}J_G١4)@c'wx3;z 7 b!̍.W@nQ2ϓ߮OP] ].ph ?GdҮu`l؏-ȼJ LoNVa!!iu. ĤogaxQ7_ް}0L]n+J  :PX# D Sս<_:bt^*j\ޑщJ{If[wD]Љ[{kuİٵZ90kMm"xYQԍ%s'ywwXMKnխZ.Kt>0:TvyʚzFUV 9l&PݲЙG@wx:[{zO'^!Hz+N{ĩ^qMVy5uٜ ޔdOh$t+86D״5&8"8hdSS9F\قڦÓ/ٳI|ӧ\0ڛ[>\nvnK8&`D(`_,C \]zBZˢ+);,/-^1"#H8; oqfmc]NIV`g?ێ0<f)qf|W2wowI!mUs= ଐ%n - v~oR5XO62?֥lyq tbƧA?<w6kM̀Ս%dH9;1ra#6BӿX`ƌ%<5'"au9" f>fƑ3pylN̓d1{f w[1x33)^ψO1wY&H9EӖ,w!{yN"3znW"ȖP҃_b|3lbn1c͟l(Rt1IP BGѵN*sl-qLQY(@ۚS*޼^V1;YW JfLSRGo}X#җ?^ CЫ>}1gkM뜍F/fpվ,G2I:dXI~pr7+x-ExOy$oWQ!QӻڼEN" (OYEوrXb8T4J-Iwa!1wAp\t~CUԗ띃Iߦ36\>7 pܪRIkrZn q ;ujB/ޫupŔDjcW=ϧnc&3 vc} ̘(JIG `bMȐp%Md'Bߍ̢/mgo߼LRh, ొ""`LH$X33hx$ve5 4b=^o3c<p#{rj4Tn(M_Ղc."Aދ>Ni  {&K%lg:!Gbx/>‹]bD/Գ]`yq^LN^2{M=MC!7 6}jq5:- xT/uI1j? 
~N߿QaO~ ̕N)}3HLe w/]9^" I5ߤ6iխSZ.K>4y^2w.l X޳Ov3tn]ܑݒ˹왍ݞpaG4$¨ =;c>/xQ7= ;YZCErtA(^DjS˕/+澛Q}ҼqS}D4r/Wj/J2H~*?BeM=RY#M* )Hj2ˆN=˧fU.Ŀ]60Cfr>CimlJ`V =3$5Hh-r6 lڨEd^z6*7fQ)qUC`P F{spŞ.fﴹ+<7{:ٛq`WHϮX8aҐiuWHI;{MBAEC]!!~ou)udϯnR(l/}|r$;uh?m&J͛2 pܴ$ł[![׵o1xAK:4"zjjYO)sMI|CnS{w$$}e~s~\M}gG}x<#-S)'ِzeJWfDn?#_Ќx٠0'J։iE*)2Б؞I1۝.'Z޳>FZQz΋h4UfWȮ&c c($&0ΤTD Y@HB5Raun( 1bgb7F2(FǫLgީTU KB_Mٳ|u,a p3*M?6Zm?HO_ְI ;\h֚WO&8.?|1O-C Y#Jh)DfE;h)n>K +֖MNjqt+PBI!Wr76mK'2(BC©4c.Bv7:׌︨(Gc lŷ">Zܛ 97kEPMsJma>du3{pi7&.toQ۟ptqxyi= !Rn REߖN tܐ,TGB Fvd7bQ39GbO Z$~ܷǣ@y(e "4'lHd1[~L-~r.adh22JJ$_ȍ_ B*Otdu@Φ}:i;U#N06aGm4 H'+)w\w9ZWVe)nqx;e'+ ~zu{qbۣ5Q%P=A l{sK/T݌zl2(f?l+a N lb1P=S5&`#(%]"gfC%;` ,Úu7v89K Nn+pN,cǮp_9P/'QaOQCuF㈷g':/e nty^7`pc1W0/Rs?{㸍=n/aOlp f0٧,Pƺm-dpp!%-۲%[$wMYb aE9bS'Aa%۠Za[ wOa>Evzg3ʸ̥vMYz@Œ7Qyҟ X }f\::5jL4|Q,c!1җ )8`ֽf&@C(dL EJ)aYڨ_,%7cvj I~RP QD!%,@q)CdQ9STS@ՇySF\|9K/[s%KyS%HxtFCp@o AG)ߗ2V4)X??@Lq eQl,ia)$X ߐBɊXʑқݮ]2dOH1++mW#@*F,<߫ڱj|1nnL3&û0)hP: yYRw3[mjW7+5>U+|*F_^)2򩨝,9 0d'7:/ndʷqI1p{j9PSwz(N-BCĵi1KH:fjVQ a뿖3ƥHeZg>6&2{PQz jCҨ+)D~mv۷m5s ?iUV:&D-*1;yUbc #_i9OBSq+oujNdzv_&Z01%8'RyuX'*9I #YDoyğzI[wgza{Tze.9$ 0G5@ ?|c:<3b` frBv7>Wd(&Flh*b1"e,cA/o:ISĆApQ1{#%Uix{ID.Vj( tA‘8)3V0O?Ӫqsg=7 4Pj7+y+6$ۼ:M*w@ ;{i_gZX?3mz*@pU=&/v|b⸔#c޹qRFfXYHӋzyU=?~~J_&_f/,kqi0b"&W:wd(Ȧm#C?!@T ن3mޛm6(ޏW8O_//<0w16fR| !@.2_ke:iqPr FJj[R@^Zvݼ2<1C1 B):-YU5rKֶzӈ!mfEǕ6IdלNV+\&qB6G*wmQ$46K\ wVٺMq$ 1cf`HlVMl62^yO:aI(lzwu@+un&75co V/kt2и ^yx]^76C Z/{U މw9B&@7aytE|$*KeHfׂ.;5Fg;6c*XaIS{,1V#댕^ZX:?|CG.OZ#u?Vɖ,Oza' tQZaGw;?+Ⱥ]{`Bh>N4[oe兹z/m-4S9u)v$wչ wק )mPb'"Q4i*T aqޕ'L6f2_ =fN~R+@F2@q}1T,x9fQ OB1 X~I1{cpB.*KJHp 1(@G!"WDpҊ)qR= S%Mꘟv`}/dL5c*B<~1A-ΨSgհ#8at"$ᾯ ,+b H"!Hc1rf:]tjḒ &Tj)8n4#Bޚ3#`LDz0:c'PO( 8Ϙ QTM ZӽdQ^< JS yY7>MMk/ZgS|1Iw߿Gr TK~E,&a֏@}~cJ6|\OZn-gLU7}Zf`A 3D`&0O{/rGX00b}3?T?7Dax}I!=^.@PL&ʉUĒO'.ީ_0"(%]خfASsSÈ.Mr$s]4 za]9睈${j]bS'gf: pqka]؝9>$~&D g : pgt ࣃs@Zuw6ݫy87+5>[}sw//~sHҎ1whќ$a ").;/`(`Bsit֍Y܍sP(fB(̃- A/ZP;:BDQ?Z!A1?;OQ;n޼jE&vD s [\ڠh;yrcw1hT sRmxRNrZܺqJq*%9״E׭A&6nyy/넫ëuGWcp 
;ptsݴ3Cj`~rORY¢t5;jPSv=]I̿ğ%Z7-'nB|%Wƃ/dN 5(@y VO/%mRS=NQIZM^h4oEQfZt0 uQ@ ƭӒ B H 2b,3 =.pr.#U|IHy7♲o@%eE!N+ڻ3KEsO";vJUi²5w8a,լQF1;cngHR+,1ؠ,xîO[n >h m`h-#pDwPi`MOG|^]v?p*rwc&K8Rr܅\n + h-=M[>KQDȵCX?#YS: )Q׻-R2uS_FE3-UEppY$A+}?Gh[#1;yMIH$#P B" %JM5#@|hb%E R,$!_a@@@ % QG⊐b $^&;@w{/&gj2UBc?)k?&ɽ޺A-Y׺xLnR "Ab 1bl021J3S"d6Tmd'saKs79-_ziT8ƃNxgc*cTŻ|w`>z 0^ՓT Kj̠,QKJ(ug->|2 Є Kq+oǾZlw/=˴}ISW؟LKb'~~NŻ;;gO ~E%oꇙ6w`p2"5Vx̣: ܵWxTqx`#5W+@:1 !M)0TVTj 򦳟j=Ll4A\H)d.ϛ\ҘG1SJh$TxZ{*Gʐш<ҟ"l ;&~*8œ9 ?%KhlΛeC`3U7h9l!۟Ӵzn?ldBwTWMJxD?^dt4?L_J+Mg?g3toxJ GHQF.Ǣe KIŁ!+ͨڢtȻ D6̿͊wu ȇf~\0#h}\S\# =y/Y ?y>vh%4&.4-snn_UOv9װrmlݤp )!yKt R156'9-n88ˌjE :hޱ{^8۠M\6v |zs(IJ!~ғӣuw;T,3**dmR+jGӟM" 6^=_A~C޸&>KKacC-W/o~ܭտ'Sez-}E8]!\mI|W&x<=VۘgӁv8D# 5~* T*1?>h!jk~Z=4v2\ NǻN!܎CNqm$˿"CNi}'Abe-IF"%-(R8:(G{Z߁-ӭI6DvܶKI:*yt}zJڷIiH3N)H fze  ;J(AڥB"vh*Ju&mX< r4-f3yEV;Y[]-a)MUZgDK^kGT#Aю5=@%dס(& :@|! oU"erzSN)FB) bAdaj4hlAek|Ե=ei~I.]K@a ѴJ1H|AM*1b[Yn͹b"ɣ0kILxeŋKqyw=IbucBʑXl۫ѝ'#; qdišT|72_9*QK݅X zx3ad28pC$s<W(DAy8L; W-NNtZ=h|SN,#iDGF 4ZjnΆB9*ԯ'u[o?ࣳ|GQN0<ba=S?aof|*<B]Y x4<qX>ipǙQ%Z0$%25K ɚg̛ӱԚG.چf-( V2맪{T Rkc4 A!5C≸Ӣvϑ ! SLv΍# $n.H6QP)!nYo.'z||> .GisFi. Pr|zFt ?,N-,]qvSs;>>8Nu=|)@4:Mx$y9ȼC_@Bzp<]|)]p JxѨ \5yo333N' * as L y\j_]=*l / _:{^Mza]ëbzS3aގ^'.sO6lE´o4]_Z4e)ɉ2shͮ:ffn<[_TϘ⌿Cr^5DqP =md(+<7Y&5ޥ ՊoBTM<\w~ @N QgRrId yipqS 4{@h¢=xhRw!I7`~WDUmꞻ.6Iųy)%%X}JN_̿?3&=N>.zAF#ki0Lg߰#!ů{Va$W h&$]W7¸hW'17"KW@1JOK(R,INW7j|s#:~!ĵLk[;Sԛ.oNճ~(G?8vrF#+qTr,D*[U*Ҁ4$BP-gU,Gg1H1љVx95ASca9u,ͶN"r2#DqEgU)eOp~y LU Xh0E0AWu{Хʲ7?eH`E(,^E. Aa4z2WR)KAz<{q?ʮϞ}1pu#V"S9- cXo9)θF!)`C2IPoU%A@<)OleksH"#V&lڈxTVrpjklLr!hs"FFճ(4rcsrxjfF*KHEG1f| Ĩ C)"ji<*Rbkw-+p6T+!&/uƑ8㡄6XmX$4aȥ ȷLv&\Ƶ}<ÚGVNӇqFli?^6 !u=yvP߮"䴬 _-྾,JnE@FEʭ$gkpu-rG`J`;Qy\D.ߗp?np6QRJgZx GȲ axr1˅v*3 2| Ga ̙Lp#jCt0ѡW͊y#[#a.9- ͍x 9!ɚ!ТƯ`$ !y51f+IjNq$QL&(8uW8{|y&K+1<:3AqIGH{.1VǮC0[[`"aRO"wc-o{\b['r\6l=rrE-g[c!lg&>醷B~RCQ"$R8T# nSItck1c@@YejN@[-":ж(뢹9Jq35 2,hjC09e%ʥK%z%ZIʒOy>31包 wc Dqy3l_f}XSd\uИ:c{8QX[k3#5)7 ;,Rc41N[ЄY- @h8C|'oSr[Zx,ٮ,\Tz͒{s/L_r)!7Dm{@3v[~p,6lz. 
#ιʹJIK3b\Q>E$f19T)F<[ .Ss(~6?סryW v)+pYD'`Z;%5ZNi(.Z q#}j)V,"LJd !Z80 U)LD4s)qSf4Z Wj\ޫ9.Ѳae5;U(WXyna2"N9Nu'g<, #Vզw A(hW1 WLp̱d@$c,%Ӂ!L-vX x֐I,1K(*h?<hIμ :";S<XFI Lr3*GJ$%6Xu:EP%C: R *E˴Ԃ@~ #ŷxHӐq_u֦E!*tKNw1OW֬4>to$T)xӆI)grUyIiXLC rZ_e4QW^>A5nmtf Hnex4 KBܚabzK=Sw#A];Dwljn\>>u񺩹>6&/ɭ$Y{LE2)nunƤwIl9Ʈ6"u̶Я>8|sg=u~fWqt#խ͇肣m9kٺO9ڱP4=SG&෸ܵ5KJ?Յ3JKH[E7/Kl8A>͍V2#l|!6e\y(Z*T2Uu]gG?nDb4>SirX'f?$˸S*L^cPB-M PPu=!D%1h6~(+[-◢_UįX۔䋺_Ή?MN_._XP4-?M#޹Զ.T6D4x2OgOj̨Ҫ1@PJYI@~ '!83/f>:v GQVA~lϜܜ8eJHlBӽ*&ޕ)f'HM)(t^RVS!m 6Z*d\4(yD1arK)O(&- kQ#6=k=d{iSB¿$_X*` 8$[ >`ƌmʌ\*Y% Tq" c)QJ5 In5lu+F,afJHyDjXD2dܣ4e.laꅧ܀pY.~*!HV?YU2eS8fo RvzM6GH,m$X$Ml!ncb+2E+LrS%Iy0IMh1sE,|w/ԋ\)yAD CyJT1_?3坈f{59+}nH }4*mql$9~[Ij dl(RCW$AI IELO{%5&rIW(LcYh,\q9c?\Cqߩ(`&|5erQEd/!ZK)c$Q vLPGL( _{w2QIvCTwVH椴ۿL0ߏ7Wf IrO\ErFB@0 _'G`YNezgYϟgl:?lF˟m?Z|c=FɕD3CzRRO3Mc))ۏK .GPG`d>xYN¿OJǣ^;z9{s朶?ퟲovS:Ig6[E=zC7 eқoz( Ć^v(^x~<ᬳ^҅;AykiNerҡFU:ՃJ}4Or )k(ѻ7/;휾yzҔrMdTǿuN_9ۣshh͛w]4jUŇ'&? _vWo\<9QOK-7" }0 !t$^[;޺le$WVNn]?d0/_CЋNzݯt Bj$GWׇ7L9ɯAksLdv\e?lЍqe{w#ƤԀ|94o(a0_=tjUģxr 8=Ξ]+smI8Ob܉S&R~/\Aהi~E8 -1!!6@o(HLGD1%H8VP":<f hkz duq 8oqT Ig'v1Lœ>ΫJGC$dr`Γ+WK z@%{2&߀(׭@bfn _!,)Lͷ' WR?k'8m-k&|{@& I(0N7:aF\1zOyE=Rc^a;T+)h ` l@` U1 B 0(Lغ] "x5+'vz\䁊 gYZ}(OG5{a p0 ނj 8\>9bI)9&Zbg)VC.iXT|frrwi 9&dqS鑍͸;f9i|ac kM=$Rl0E0ʫK r'RA3ObB5I8{SXJPYGVnc O*FEjܡ(]@Xje J! b$ #qmH@CDE"҈4b6!'aC%Q&Cjr. 0,bvLubdL,s &1W3T:.AMJXBQ. IIJ sR 'bq0 si#MUXr2͔^Ҟd 1&(Z+k"&){51mJI!%@/' XxɎ kѨ9>?tӊ<w+MQK.2|#Rd;ڌxtU.V6גTyZ-PE7X$!lDT^j=3ц"Mcw P/jeP͈m<9m-5ro@̺-0RCwX|PaL O<&d)ɇT `ޔoxfW%">DXn]q(:_4P=3gLV8Q+Guܕ\}.,|{Cuϐ: -_)Sޢ$^MoJcoΉMQ$)cDx21cOi<+tx |zhQ[o6@ }w`2^lI"#돻QQҎ&V', qӟ=|돲#D*[Tc&(i!e0T>f2RA`K$oN;n*9!O  …AZ|xoӜaO-V'v1myӰ"QUcOrZNRBC L]͡Y,7|pi"Bĥ" T+&. A#E&sT 4{ūM5av") l,md'nCTw 55XHJavoW`M26R!CBkBjsD%"eF$ D2$Q_j(l=wu>yBq`yPmcQH**U@ ,P483h#0Uca{`ƂcRY7 YRD`ɘ(Gu)6xmWX6 cscq {*8Ae0Ae;S I.ex=ńлOn!Hwk(g〮e$0s J*`ݚ񲀵+2ɰ( J`)h6,ɈDĂ`@35]`g&}BI@etezV/E%ϟl~⵬*=_@n}ڀvڪk@縖 bE Q.`)'*M(W~xnCuZCRڇ򢂵HC :ZYrnjCSp͑m <q(<rl! 
W8*۫XJvQ˄kYM'[ Ra^#Ԗa ,f?`ު y 1Soő,V6D<0E -~Xy+cUg1b/w*в.}N|p/m/movxuWTcA+9WLOˀ+Y\t- rl3\~2e?A3L, +VW%75rZ]vNysSɍupWt.QGbs ʊB@Ny%njBX0ZoS7S%89 L ! w t7k u4 ۅPRQR>’sqU: pܕ8;h֮N9%ы[\ La`luד9DD54Ssw7UJ.(`_(!H4nn_T.h'ZL7nsr0>DkW\SI_縘Au P3ڪA}<\ NFC5j A)gJ/T $eZA-X Ô NU\+"wm`&*]EEx\X6۪t 4NVlr>QiRrmJe:ݪp.>{t[O6iДDwh1f@Xih$!R1S<4TJ(\ kbCa"l93wIƔ/\{T*cՂT2f\!4AH C:t$p# k."M&FV;7ɿ"̗vƬ"lb]8H6{_lvV,ʌl'WlZ3={z5Hzl38/_߫@*2>,f f|E"DKeIѕ*×0JLa>u(mYD'}]^zc(@_II E$#%{#=G5PӳMm}8N] ǗKƇH _'wx߮?,NKΘ {wg ,ե6z7˪ٿO{+imtƊ k Q heU؅uhTΎN[Oo_-jŔ&}b!@&>_c/Vm=>`!ǟ]uwXT-0A{Ț>H9ZOnԌ yvR@j{}y\WyX-|9G6Xuuvv߇xZ֌;QBg(z&ڹgfԧup;5S^gf/?^55>wnlF0N.W A-?/J`܋B+r !j1߲m|{ֈtl׈tw%cUN4ڍFXeN!!`~o<;1ƺE;$9< 1"VB TH`/4;zEIJCmD]UD.#Z:Ua`&qaU 4O3~R4sIl|M(I R1Dwf]:ѽؖ>W_Ď2}f4]5oX1~kNx~avsk`q~I(V4wJsA0F[Cs%lSpsTuIΛ Ǒfd&]_m9ݲ 'q4\zg]-2^(UfX%$uv/͈ `y~ a҃L6ԣ7PES#h:Z5L#E1}젱HpMwTSuYhy{bWv_6gշ5^8H; 5Y(oowg7չ9i7oai\{“ I:gqHgRS`Zc'E@4pIK|V5^H[g7 `tDKnOm<'w<27VBV'q,~ͻ񺮰K)uloQfע O|pYФBl2oWy"˽3 "zT 2NY sHTԢd\":'~N\ C Q>{ R {޳;e\yaNh77&ҊC8m誜;SU~ ORgm`{rѷaBP3K p$ɦFG:PNl;Xd-=M;4F=2';!*[" -\(z/~"`  An%,ROt`p9­ȒeI޳,|7Kb_lMo\+ԘTi'IE@2'ڕz,7<)Еtݓ79-QO|4(*T`[DeUX]JJ/hOlU$|ʢX =:hYjHf߀O}.$ɬ:NU)8eV73*a.maF% #XxeUQVֳgQy`}7_ruc(M#nd M/ 8Aw$.ځ*@ƞy~MpkÍ"\1s̩ɰ  lSb~ -UFUZAU DPUβ(j /bJGRbSb ʙ~Mot#$PNkڧ{#S0>O(xOœ$XSSڹ,ES7y0 3$6I0Z9L1VYJ7T:LŻe-` 7L=u6uk뇰;9=pD%r Ov(: `d$nᣎL/AϿO'm\,C #wJ9v^?6swl(ʳ*P B""y95A;UA|IwA!XiB78$,>n{!B`fy|*}56*t!h_Y6M`˾]z ܨ_H;_@˾T4)r6PQGuřtj>?{Fu3ʝP̹%= /]Xy?]Tu<1/s<_G8?_?Ƕc,~83-Xj_uuC.iRVW=tY>t^MV:eq Q9sx²(!ڷhRV 7z{KVkNmcigc7WF8{TT%AЬYVz-N!];S3}wۊ]}3jAr`c^2? 
t` bɢڸvW~nߛ'Qys"xfsj瑨(-]|={/_~p*gW{v^3Z=p@)R1:Fh ^ Y%/$*Tެ˜.&j|B=u,@x:T_YQRPzCNTUcɦ *v;/p dU蜕l0V> +Q S'qu%GcB d{XE  [![ĥRxjh!;ls;d-򵰫 - yI`5qq#F cuƪqdss/w";m$0AjvO\u|V]%$[m 7?SjuqOD#]Ľ$Ղ;p>*:8|B;@u`D>Q@Mm-mm+jV]/[`uƋcHrEpzhЉS;F}_R{nLsqDvF@ILf?m7inӨ3obPRDqt"N4")83]Z$D0֣2:T"Z$a*- Ւz"9`MX!z@Az 5L^B.*f3SaM.,R kDW&˯?K^yW4T Īp>@BT&:魤( TRZ} -JWS*"^XkLE_VN..Y~=jUz )KĔ|߸vQgOR7KL2]L,z[ n]?vIt|W+yJg02s>#"5KeN -oȱd׌[@-@ `;ƙ=R!~v2PG Q$*,̓kmLBAjdQ(=|l~ܤF/jx9]1Q9 vgTR D _{ͨ%)H}m֕J_EwAE I=*rNf.V5Zfawm*X/AV-6j`֦NE ϓ:D X-a5R@d0 Iu}=%h4An˚Rh% h| m\)کTCLJ9] Ș̹yh9a4wV XƝZM JwNzŚk$wiRkWFol.3v=3"^&|Z0@$-KϪ\H=˒)L^ .5AՎ `eRbc@7ZPl%f_9,:XWVy;:Bأ+zW>wjdU c{R`GD qDX#|iB7Y B.A}ecM{Aek*Y:=JF0R|: lP;sV_j6ׁɘqj4P_m}P MaBIBSQ8W>y!}kIeHreDg'gN3/ xh}"/}LE͜Bk%;`O?}]ggw/hNjKI}C@VG643':X `$:@b8HȚ5Pzu,ʪBUz0#9h+ /0NUJRKeYlUǝA8tJka ]XE8փ!KY%g,tV߫`=^=")rG]h@Y ()=O2PIc ۙd}';r։1'Z|=B}cyZ;K#@SmQS)iR_L@w@bJ@k8Wu@^y'3+HxH8DjW?{ڛF_A~ Cnݻ:Mهv#(ds[=`3`lz9qf鮮WWuWW!m-g &ƔD >_#RSX~),I~Jy8#L!G9#Y9c\ BE#p j 69Kd_anŧ}Lqc<)zGp) V pSm tYD0 kt#ı#EX3ug D3XCC@NSPܦ`2%a eG"Bi&F#TE$W^fZ"1 -"%ˆDpcEh\JFe`*^5촷DQ A 5iHDPes,*! 
NF\=sG9pNEOBng2a8c)[nb L~R0γ$Y2%`|E{'3FB(N6gOPH>T,=ujV,&8- ڀ-E.VN;1F[]PJWб|I0;E,Erݕmr0[NH3 d ȕٵ\qADGY.`H:@!FJ ߙ胔3|s%ffP0!QA"2l@&~e( F/$m%Ԃig&%lb)\G sX`FZEC0q j LFT C.2JTkŚ\ ֵw$!/Wo//I)W(%g$fj[ LN(J+N)l"S.3`:Tp (xӶk1PJ!U=m<9YDzA ) J#BLpl6~3X%$VJql4f\`oHQ*%Lz.)\Nƍc6~Y%6Y!S^I#H5,f/XHˎhj+sJ.ϝe܌ѼjGgۏfՙ@ 4lŬJI n{$t:pgAB!6`|n%/.gLDg~wm| FϜ<kH):³Xcmw[_[%xM.zeܺs $dFpk;*y:iBN06^ kMW[=ḫ H6*&!SC`[>юuwt*Logv|D>qD[AcHqGr٣<S 6ɒBMk߂=|ݚ\LeapbgfܩY~.ywOc'i?ӻ'L'l3#~zMyrMZoT:hwRp{'zNoE$5a-IO7X6E%ߒ펍Et Ɗ<+,-DcP i1olYA2br)Լk+/xYmu7]49dv[ `fs޵9BjJ͞K/PtoEIצf.۴YD[M{p=Ck=d-|+1Kqt )Mhgso>NWfnȑA5FηSs?Rm1SS09话>v:a}e~t4ۏpp#<IVw)mm)rn$IvIΑ86l&a+IB>Hhx).&uMhϲ;rRЎk~<27dNkiti)V·c4 RmnK+r8K FFp*\q$bfq&z)qh$F{ҥBAuuzɼRrR{֟./'Pjv)$Pݵ)؛C|+bUO`iMKlE,4 Y%v:ŤRX4 Ԋo/DU7 j\Ga (YɊEqz:؅Gm&΍je%XH2x2fH .B;j8]GB0ytKȊJ?} UkH N+ T%" HkLT2U+iF-kN,3g#`HDW &QpLzx bZPm: DžN!1 (RTiG S #iDRmxQQs,Ӽ`LH@>&0^ uO%Y XWQ 쨦#\lླྀ{Yn#s486 ?4@X*S0) ݂I#W!:}[*U"09NФz% E3NA?O s#;W"(_f60h{c4 s {+8ib (B^˶Vi%\KWi<_ʵEZ[.;OX׼Wl5Qk ޡ\YQkuF)W4*I@,&B!:ZƓ>RTd)7n.V=D{ +P'*ԛֳt4y NÛw?a+k)d}ȓk07Vʍ}]íu^L@6\ɲ84A"?כ|1q:>*5l#/`X#q'DZ;>8ΰͨ•L@*C`i6Z). ~,UKT.VHdJɧ{e!°'wӆk^$^$^$^TI8kAۚ0!kV\>*aLڪR: ?x^.y}GAq*vmW|P\_]tkqH"+6vJ >og8a=TcjaWyu P9KR/)>_:i0zF%T5jgN[pW'I LW07#+!q'vT.Kldq_XM'10.'=6U цsqK#5}}ٽSChFQzJ-JZ1؛?Ͱǣ2yo:/[ _l̞y=i FBl=rtc7Vjַ2Yoa/[_7`Nx}7 Zt8SCi2J:փckaWVjn0_0Dq-dފ #.dI%'!rK$ jSGHP1& zzR7\'k V\%/K6gO2P!310 5Hp|6uRa"kN:*63rDX~i㰍֢ FQq L%5>_ǽR]nbFYZ_Q+.o&|/ʜ8i/=5'ҒDnEԶZ0)=-1\mh7ezggK 6JԼIE&ԢRe-) v̄Z&=F 9L:LB}m9ҀuY+ʏ3cogw4tӠ+zzf.$5 , ԥ LRVɴ''8^|2؎>'Y'|]FɴŰUR\j΀LߧS sOR)c#<}6|llík;0,|a śIxstv0]hn@n|Y*b.? 
Og0[]t$Fo^{?B30 mw?>\#p$e1Oʗ>\\|).7ne$nzr^N|2`~s78"٤7>KN__cм-Åw?C ¯=.># aja',5]-@x"g䥚ۀ uʻb֋Ϯ7-.~x>MߵAmN;W^>w 0i bRznƗgfhl+&wի/OOFo`DFGPoEѿ04t;`s~ˠ`0A@Q4<򫧽McgzS{`e[?V]IϢUH`np9b1r0zz4qq%L~' ?4%4}ͦI\Lˤ UwBE7VOXzs\9YU(Jou-+]r6־/Co8 sۓ6̣TU:tkm-o$g}k5z2e.a430W1)Nn<88@΀e:zHhL >Rtޓ6r,W} uWW_%x zuSMR aK{: uhPw"0)ab +N$DLR{kE`=:UsV!2iWqeȬU7gZw&mA>lYv֡g'in@<:zR %1gǮG>}5>{q8 ę˶3O?kjh?m ̅\ znυ~p޻B*9^:Gycwޜ=p6?q{GXd(r[p$-a}m: q_SXwUBj!PuFsp'RqS`ZXPǼ(6E$5EΒ[Em)N Ԇ2̡{*s m+Bt͚6VW&xZi5J֨i5'U c6^qڛFZ3)4.7VqM s'u}QI`]_f& 1 l,%_v"cKW3]x% 6\yvZ9#ZZؠqʪ0 Itj/ɹwjL-L'j[O=LhуA*XUHdR{bDVxm r˂v dּd.H܇/ 憥U?D!ҋɈϟ3*V?_}+JaL__3X!Ok|7.ݐ{ '?%iNgr&ޤ݈Қ__}7N(\Aˋ[7yy/M2J*9݀6py0'OVBlzD[Y/5cHor蜼kYdN%$.5#N NPObVYSl 1Ԋ׾6qS$PY/ H%1{q)25rȴ% ^^4 T2| "`Qڬvm;@gLv'(RO_pK?|_~DR~Œa:x1;-/~M`?>|{f U.b}pR[\Vb5KFO ~$饳օ-Ŵ1Y I‹|ƓdzˋԲi7O*e$Vt(QS-yN-0!q~Ms?<<|**Ԟ^V_AG)HQe+ 7\ J;P=O^Jv^YPw֡h/x}ql=͜yW9a]\P.Nf- d9b()Z (0ԢX6Q/ht!Ҙ e"2ԖL9:.Vi߁^ gNHh8!dP Yԏr*9QU\=gzil@mT]JhoHShMO+4Z&')Z5 RbbufBX@IAHk5\SLnB81 FٕLI{ D*!<S]ZIҽ1m$NF]'ld0mjn{62'mdd*WQ& ؛ZJ6YrQ;{mZ@S'㾃s#wnBm!#uz:m鷏]秇"LwLH \m;aTjeh 4Jmss&csz!A,PiՖg)[Jޒ论yi|x@ zp4K)x:4P)ɇ]C7O1=0 7l-#!94Jw,RjWҮ鸫^K_^].@=qE|H\o4]( J>=?<<aX>U?SQLRC#¹@r=8y]d|Ҭb'/Y׽׭ڍG`Y>]3ov  RimLyHӊg^ ssdֱY]!A[mʉ4T:ZYjFP,(椱27B\X-FȅQ . g4KdRDmPv74+Ym$\ZK@I$ir;+ނ1)]w|FZ]xB[,! <{z65-OQ$._W0(cFZiwz6XJB*A?ӳ9' f=it+e{LvM2N/ |dBIOb>KUgېoMl+1$"`]]|Tjm$7q3xmD̮m==mSskg`Rފ(ȩѽ׶UXԶ28dRrԇ: \%"A3Asܵ$־DK+"*VwƮ rq^u|.Y>*S4ٿuL(OϕLh aӭPVI\Q'!%T*V.y~x?~;cC wƒgF.#hk>gn9+\}3Bp&^dl``ѵWپg pоGIlÆpd&rHp-ao4a>F>?T$܃>Q GG|4bf7&WU_jÛҪlK #چVƍ,\޹0Rc%#흅(YWYr=}U 4nIl~ Cb\Ԃɛr8( _C=|H}4:;} ^-KqWtW=51(A ȑ$%S}sDgAv'ՁNm$= ھW%H"љ%>*~Ƕ%VbuA<Ҳ )0pzQ־zF\ɩR0xؾjjEM5&O1 #7ّ(yAV  ,`cպ֣n3Qiء[ۇȧY{>/ۻ?onFVFD"Fdp #P6'ŕ7tC,Z Zp\uep/_#U_7 #%SAP /\()Զ'k-I 3pV׌?aCε1'5r5*ug Z j[UtWf^y6)j@r#ɩLe2`JfY! /<;*<:m+3]`9X%(vi B:Ġ,.e4#8!@>T<*F 0*2@6):dQbgY5Y=OE?(?OWO~^/2-Bz Ӳcl@C@mZu 3pDZ DBD0C~"oIJ_?-]!𹔐;mi9U;i4 D4X#1o]+pm9VZ1\YZ@KbH]|< JXڱȟd*XWs֕dI! 
Qs5Um"nP#@VԠu"W.jX) ƨ92W#Oza 꼲\ Nw٧3!;|R(7 R+AXng+ vG@XШ;z{61޵qdBe1UcOgM0yQW,)$8SMJnQ\nREх|^s Ϸ,.3\*"p"Or)akaqmDZ<FJ@w'tfW-NqckPKzøY9Y9C3'&0L,`p네6ޖQ2j8eARYSRBE3vRgb0Ԝ}Fa= ,hf [WI;F$|9˨>VɈnnb6qc[rB(>{Nۘ1{}K}.EQb0/SsI#]6ܔBb6%:)]ލj3fUVHF-V?bt>l|{V>͑Ym@.3_)͗y^- ~P4`Vӯ]Jb ɕINSW)kvEҔS&;èƗ˪%1z.m7@_Fa]?F8KyX07#%?6D;CnX/d̲5G{.ܴ- V;Y~zUWNWu>\d9A *7c (._x Fow>M!2oiKVmz-#z!X7ZsBcz2uqꥂn}rCQ%ujҳ孏U1ZaϾI? Yeގz񧦠zL󀿇˸>j]/'wf=G>K7exK=PSZZ;_j㎸dCyg4 M^c5{: ibru` y"CYJ:L΍l ZeL8,e%]^ތg7'\}pK3j4Sz*MgJ7^pBl: {⚄ r8fU_Z-0ZjvcI0-jsSo/TD(j?7W*:i.E&l^Q~yɊ'S ~m ;#!_ aO z|' VAYf]$% #D(w>&w+_*z6^۷ ¶/YG0%OxoLyqݥ 3̱fa~^#56sQmPQS ZfۉlsRyY$Hc`+q[J*EUp3̲ȼu͹-aT"cb(͸1\,u"('`<–)JSzù.GD16@U}ma=)ravn/zwG*{ X}d:Ӹ:R i q#G49t8Rȱ ljY иdKld{Ra0qKAd/ϗ|)C`D%. ݁0 BN %iȾ@m97e}!2$c#$$$P`EԯYŢqR{ _2K1Č{-X֎ Ke0 o1?傀t1'ֻLR'Yy,1ކhP\ڡ*cEnjrjّU;OQ0I"1'KѓD% ȁxʃvB)rfre8pcN<[.1J>G*DR0HBhM*fDb4!s bH$0.wD;  q)|10SjO@['@(4=r r@KjG}%8'%vTS&KV1p` v?򂛈<(R.t"n&꣣ 'AB"\N1I ~(=iX1c\z=h93[5Z5Rd6BClxvYP!3mRH.iޗ^RWD/ |z˩t۾Y94 Sm.ESe^ w ؋vNoj;*wn'~qBs:rNq!'B53CPe^Y D@\Jz[.rx-ܵBZù,oNQ1p؃c`ly 7 ֕%M uT%4 `^wծF=`Vpc˲ !)_AkhI+>Ls^@TM9)0Y PCcZV7ꃡp>j֠O\HCP9Hh4TV6E!iRBhÆffqa:TQ}Hϴi0 \s]} 3ϲ>x(DbzX$18!-ϸ޶ jmxBv3/zmE#gG SŜzs}ʓUT\M~58W3xv?2IA㓯SKL+}(;V|$|55։@4Ou,ionޯ|+ToyJb<~%viIIII[4W0KB:3딕3_ĔȂds +I<'գj~jXlvݬNG}fTTGJ(5OCAE}j=h{?|gij1{Z**JkURi_.ʆƬiFOAVl\TppC=!Iʬ 褌`R$K#̀S ^p-m䁉}mB`ZퟎIn\Fs]X%l9~Hiʑzkqtp4K^D積9ʱ^0 jRtmD T(%PtPhF$!# ݰ,:Dfq6vp l#Fs]x%|̅Z>Q?ŨBon`h)oc7h+d SEe,@K&Kϵ\5Ąk(dNF` xLfr*>)>rZ4-Ԫӣx\XPo^g_BΛ!歿M?&5kBi#34*F8%/G?qVQɳѥ/1n0#GdUY"&OF+B#͎f)qLIJJ,AoVb_4XwUe:DӯY=g96EbZ|1ah<3f0ZąJ(oܕTtSbލJGv]UӫFJ$7kp6j=[YsKv}rpz+ .4.h/TzR`,ajp t%e8!Ej h9aDhɣ7Z L%f,\yi ԅ- Зllv)_en1xjtrhPfoiف c4LwI Dt\XX߰fSrw$džη|gE*䞤D'~EJzE<%%ra^~<ԖvطgՑ`σQRi;P"^o-v_53;E^ahDžЛ1l]o9W}Y2#x~ .L&idǞؒFGyݲE]'@TwWbu%9PП34ynC:-_aPn7.+#lr7X |mPIC|9MuI'k' s'[ʉ^~M툖_UUR~r?fbG|q6?N?PveJM[ s{K3[IwnWS<#xo+x$BgWq Ab#:﨣ySkHֆ|"$SA$繞Zw R1wn=DpIHw[zɓHֆ|"$SAm=x֜jI!grD 6{k( 3&scHӜjtk%ͩ&^&_vwTɦn{k?"E 4[fwW.(v7SQfR !='%+sw3L'_O&2ʍ0j2#]ԟN(yFTZg:5h,r2/ &B@|VpqŤ/BAFh2A̐ѽ#ܺR"oRG8~ƌƭZd?r*0ĭ!e0̹Ή[l 
8%Nc0o?30H$Ƞ\SO9;D#H ;yyan|yk7d6.޳wT,v_^]ջS&-O>cph{!1PWea݌vf祵(Sٲ7Rܼόrq.?j0VکC+`cLuEBEa9r1]..p_׋D|_Bp).b77v4[xNw*s-lRd|LYA[ ޔӻ?TG={*1֬E 9*H((2R RL[ DU{"';_.̰ lDz-@A+(7 -#&\kC&=5.tܓ͖pn vR. @"]ݸr?!+¸U"}j\%G .[JaZ"[b\ 8qڶ\jH(8!@³ ܌4_$m$uv) vJ#BG"y{Y~-jLE7$E",NfXrID8VB ֌y`MGV$hVVdsZ۱kx)JІPNϫI^or\K+ Jdo?; *7 G宷'Eeq]of0Fpk 5ʄ0I$)9aLCjm_~5J!P@|&׆T 8F{y@ne~yA UZeoVa)!>Hjoʹ_2Bxv۝E۪ 2ݦ͑Z-<2ʩK!{YD51 E!)UP!<9HMKzHZlۻ/κq̇ߤowBy*& 0hX&T#pѯ0WU@fZwc)D!yl)I!Q@Whaas4̋<+i᎘Z;eTkK$2h{)D!aQW;j*F]TؓYk#JQq" 7WpX*tPqY-r |5y! +PiV(~@ExİXq!bOsj [Θάi1Uv(_RB+P9IuS|Ab),JᛦBF$ 9hӬD @35iBz*?E#2>e 僡 nt,ͽzl<,OC)k2{yyy=Q cmx;rP^,m -,(RH]j ; 휗{hhC=Fbq'jz'A%[ DGp?1h'$O)Pgf טCZGmȇp@a%qca%QkfX d#8e_!&9Z})(m!mVwA.Tm0Z4fK5N?65)GP ۛB/<zCÃd݃G;p  h/w]kaX$ן#M!&/),7fZ3p?uaKRBոw`|uǜR-%xf;:e{"K'0':@<\\Y+́gmt u甞lѳ䆠rzvHf# zЎ'CHhC1 j9@#X HӂA`*Ph\(B2A,vbI5>AVJaؚqŠ(W\%rMU/A Du1s[bSԂ@0ڌ1?*e!Q˞H)r;1ThZH B6;(p jh+-0#`b3S +PFb(G C,̭L{ƑCSI[ s&)\rY jf"^H@8q$hֱ.KA]VSe`BA}\rg65:'y4*Y7`3 8wq!B=^"cpeU j,C#TWJP\gH0Υ|k$W5,^lo| rbYhgk rtlXͲF3I$o. #C< h8U}L繵 G IwK8``Bx'yNN Cȩ^DDy{O~5v1a'?>G}`}Uf?P8,o8K#`+v_W$Y0$U^@~Ji@|q4pGc}/]/]RMqk@7 4 OCor,K'wHMoFsvaq:kt"Va8J|(!ljq]SOۃ (=S/J K n.Iѩε~2ZJ Q nTTnU.3_Rd4-H6*Y)ES֒|"$SABùn{ڍr{nĈN;hM&XNS87xC`i!PD` $5 E$x3)F&c0':V5(`QJ31@M f9-4qAm 8|P$h V+jdPTH@RH0PȈP%% [aE7ք`Uȶd8GGTJNBDp_%u(özCv7[kj* N5Td#Stt,ALNJ4;*358[.@0+@o_7c?&|V&'x; YXE?N/-j, ۋnH0$7f\[k\H P8EF`fVۻA +.WaL&aZvKoo_L >3L]'.:|# B/WS}]v8"2̫tФ!VD>O"#\V f7*55Et}'͠lJOأ?ɚ/vogefR map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 19 00:07:40 crc kubenswrapper[4745]: body: Mar 19 00:07:40 crc kubenswrapper[4745]: 
,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.273760031 +0000 UTC m=+13.811955152,LastTimestamp:2026-03-19 00:07:29.273760031 +0000 UTC m=+13.811955152,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 00:07:40 crc kubenswrapper[4745]: > Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.488198 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e156a3d8b85d3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.273824723 +0000 UTC m=+13.812019854,LastTimestamp:2026-03-19 00:07:29.273824723 +0000 UTC m=+13.812019854,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.496573 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 00:07:40 crc kubenswrapper[4745]: &Event{ObjectMeta:{kube-apiserver-crc.189e156a5f1e841b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 00:07:40 crc kubenswrapper[4745]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 00:07:40 crc kubenswrapper[4745]: Mar 19 00:07:40 crc kubenswrapper[4745]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.837106203 +0000 UTC m=+14.375301334,LastTimestamp:2026-03-19 00:07:29.837106203 +0000 UTC m=+14.375301334,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 00:07:40 crc kubenswrapper[4745]: > Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.503449 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e156a5f1f6243 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.837163075 +0000 UTC m=+14.375358226,LastTimestamp:2026-03-19 00:07:29.837163075 +0000 UTC m=+14.375358226,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.510766 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e156a5f1e841b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 00:07:40 crc kubenswrapper[4745]: &Event{ObjectMeta:{kube-apiserver-crc.189e156a5f1e841b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 00:07:40 crc kubenswrapper[4745]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 00:07:40 crc kubenswrapper[4745]: Mar 19 00:07:40 crc kubenswrapper[4745]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.837106203 +0000 UTC m=+14.375301334,LastTimestamp:2026-03-19 00:07:29.84328673 +0000 UTC m=+14.381481861,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 00:07:40 crc kubenswrapper[4745]: > Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.519269 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e156a5f1f6243\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e156a5f1f6243 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:29.837163075 +0000 UTC m=+14.375358226,LastTimestamp:2026-03-19 00:07:29.843333951 +0000 UTC m=+14.381529072,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.528799 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e1567e3ab77a7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567e3ab77a7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.176034215 +0000 UTC m=+3.714229346,LastTimestamp:2026-03-19 00:07:30.248836154 +0000 UTC m=+14.787031305,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.535939 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e1567eec7ed77\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567eec7ed77 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.362448759 +0000 UTC m=+3.900643890,LastTimestamp:2026-03-19 00:07:30.510067505 +0000 UTC m=+15.048262646,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.542919 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e1567ef83aaeb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e1567ef83aaeb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:19.374752491 +0000 UTC m=+3.912947622,LastTimestamp:2026-03-19 00:07:30.523764296 +0000 UTC m=+15.061959467,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.552171 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 19 00:07:40 crc kubenswrapper[4745]: &Event{ObjectMeta:{kube-controller-manager-crc.189e156c91a747d6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)
Mar 19 00:07:40 crc kubenswrapper[4745]: body:
Mar 19 00:07:40 crc kubenswrapper[4745]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:39.274864598 +0000 UTC m=+23.813059769,LastTimestamp:2026-03-19 00:07:39.274864598 +0000 UTC m=+23.813059769,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 19 00:07:40 crc kubenswrapper[4745]: >
Mar 19 00:07:40 crc kubenswrapper[4745]: E0319 00:07:40.559308 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e156c91a8f5da openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:39.274974682 +0000 UTC m=+23.813169843,LastTimestamp:2026-03-19 00:07:39.274974682 +0000 UTC m=+23.813169843,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 00:07:41 crc kubenswrapper[4745]: I0319 00:07:41.086838 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:41 crc kubenswrapper[4745]: W0319 00:07:41.246390 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 19 00:07:41 crc kubenswrapper[4745]: E0319 00:07:41.246483 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 19 00:07:42 crc kubenswrapper[4745]: I0319 00:07:42.084415 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.086648 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.245932 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.249684 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.249872 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.249964 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:43 crc kubenswrapper[4745]: I0319 00:07:43.250143 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 00:07:43 crc kubenswrapper[4745]: E0319 00:07:43.256593 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 19 00:07:43 crc kubenswrapper[4745]: E0319 00:07:43.257414 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 19 00:07:43 crc kubenswrapper[4745]: W0319 00:07:43.479386 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:43 crc kubenswrapper[4745]: E0319 00:07:43.479459 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 19 00:07:44 crc kubenswrapper[4745]: I0319 00:07:44.088184 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:45 crc kubenswrapper[4745]: I0319 00:07:45.087088 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:46 crc kubenswrapper[4745]: I0319 00:07:46.085969 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:46 crc kubenswrapper[4745]: E0319 00:07:46.221320 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 00:07:47 crc kubenswrapper[4745]: I0319 00:07:47.087283 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.083585 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.455503 4745 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38688->192.168.126.11:10357: read: connection reset by peer" start-of-body=
Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.455605 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38688->192.168.126.11:10357: read: connection reset by peer"
Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.455694 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.456135 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.458503 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.458580 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.458605 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.461446 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.462924 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 19 00:07:48 crc kubenswrapper[4745]: &Event{ObjectMeta:{kube-controller-manager-crc.189e156eb4ddd2c6 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": read tcp 192.168.126.11:38688->192.168.126.11:10357: read: connection reset by peer
Mar 19 00:07:48 crc kubenswrapper[4745]: body:
Mar 19 00:07:48 crc kubenswrapper[4745]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:48.455576262 +0000 UTC m=+32.993771433,LastTimestamp:2026-03-19 00:07:48.455576262 +0000 UTC m=+32.993771433,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 19 00:07:48 crc kubenswrapper[4745]: >
Mar 19 00:07:48 crc kubenswrapper[4745]: I0319 00:07:48.463008 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1" gracePeriod=30
Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.472387 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e156eb4dedce0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": read tcp 192.168.126.11:38688->192.168.126.11:10357: read: connection reset by peer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:48.455644384 +0000 UTC m=+32.993839555,LastTimestamp:2026-03-19 00:07:48.455644384 +0000 UTC m=+32.993839555,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.481989 4745 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e156eb54ee36e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:48.462986094 +0000 UTC m=+33.001181295,LastTimestamp:2026-03-19 00:07:48.462986094 +0000 UTC m=+33.001181295,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.496867 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e15676cef6c8a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15676cef6c8a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.183999114 +0000 UTC m=+1.722194245,LastTimestamp:2026-03-19 00:07:48.490004045 +0000 UTC m=+33.028199206,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.750726 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e15677ed785c3\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15677ed785c3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.484422595 +0000 UTC m=+2.022617726,LastTimestamp:2026-03-19 00:07:48.74397603 +0000 UTC m=+33.282171211,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 00:07:48 crc kubenswrapper[4745]: E0319 00:07:48.767850 4745 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e15677f9d6f78\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e15677f9d6f78 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:07:17.497393016 +0000 UTC m=+2.035588157,LastTimestamp:2026-03-19 00:07:48.759577536 +0000 UTC m=+33.297772677,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.088416 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.137189 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.138987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.139057 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.139073 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.139899 4745 scope.go:117] "RemoveContainer" containerID="7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8"
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.327538 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.327935 4745 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1" exitCode=255
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.327994 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1"}
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.328034 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c"}
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.328144 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.329330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.329365 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:49 crc kubenswrapper[4745]: I0319 00:07:49.329378 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.085249 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.257602 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.259471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.259540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.259571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.259628 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 00:07:50 crc kubenswrapper[4745]: E0319 00:07:50.263704 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 19 00:07:50 crc kubenswrapper[4745]: E0319 00:07:50.264207 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.333501 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.334063 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.336412 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" exitCode=255
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.336452 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9"}
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.336494 4745 scope.go:117] "RemoveContainer" containerID="7195f8b84311a94509530fca982c585b812e4ba21c18f5e474167c1e34a5daa8"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.336727 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.337959 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.338007 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.338022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:50 crc kubenswrapper[4745]: I0319 00:07:50.338824 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9"
Mar 19 00:07:50 crc kubenswrapper[4745]: E0319 00:07:50.339120 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.083666 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.342135 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.534591 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.534819 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.536249 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.536299 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:51 crc kubenswrapper[4745]: I0319 00:07:51.536349 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:52 crc kubenswrapper[4745]: I0319 00:07:52.085296 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:53 crc kubenswrapper[4745]: I0319 00:07:53.087815 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:54 crc kubenswrapper[4745]: I0319 00:07:54.083361 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.084364 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.108579 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.108829 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.110243 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.110290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.110306 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:55 crc kubenswrapper[4745]: I0319 00:07:55.111103 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9"
Mar 19 00:07:55 crc kubenswrapper[4745]: E0319 00:07:55.111369 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.085399 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:56 crc kubenswrapper[4745]: E0319 00:07:56.222149 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.274456 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.274721 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.277502 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.277560 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.277576 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.282267 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.358475 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.359571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.359624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.359643 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.788212 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.788411 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.789734 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.789782 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.789795 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:56 crc kubenswrapper[4745]: I0319 00:07:56.790525 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9"
Mar 19 00:07:56 crc kubenswrapper[4745]: E0319 00:07:56.790734 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.086141 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.263846 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.265417 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.265463 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.265480 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:07:57 crc kubenswrapper[4745]: I0319 00:07:57.265520 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 19 00:07:57 crc kubenswrapper[4745]: E0319 00:07:57.269524 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 19 00:07:57 crc kubenswrapper[4745]: E0319 00:07:57.271321 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 19 00:07:57 crc kubenswrapper[4745]: W0319 00:07:57.722715 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 19 00:07:57 crc kubenswrapper[4745]: E0319 00:07:57.722776 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 19 00:07:58 crc kubenswrapper[4745]: I0319 00:07:58.087360 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:59 crc kubenswrapper[4745]: I0319 00:07:59.086135 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:07:59 crc kubenswrapper[4745]: W0319 00:07:59.801722 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 19 00:07:59 crc kubenswrapper[4745]: E0319 00:07:59.801784 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 19 00:08:00 crc kubenswrapper[4745]: I0319 00:08:00.087306 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.087226 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.542002 4745 kubelet.go:2542]
"SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.542294 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.544026 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.544101 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:01 crc kubenswrapper[4745]: I0319 00:08:01.544131 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:02 crc kubenswrapper[4745]: I0319 00:08:02.083492 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:03 crc kubenswrapper[4745]: I0319 00:08:03.086248 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:03 crc kubenswrapper[4745]: W0319 00:08:03.891505 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 00:08:03 crc kubenswrapper[4745]: E0319 00:08:03.892048 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the 
cluster scope" logger="UnhandledError" Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.087415 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.270378 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.272284 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.272547 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.272791 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:04 crc kubenswrapper[4745]: I0319 00:08:04.273060 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:08:04 crc kubenswrapper[4745]: E0319 00:08:04.277509 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 00:08:04 crc kubenswrapper[4745]: E0319 00:08:04.277572 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 00:08:05 crc kubenswrapper[4745]: I0319 00:08:05.086696 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get 
resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.085233 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:06 crc kubenswrapper[4745]: E0319 00:08:06.223488 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.553661 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.553946 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.555730 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.555799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:06 crc kubenswrapper[4745]: I0319 00:08:06.555816 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:07 crc kubenswrapper[4745]: I0319 00:08:07.089099 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:07 crc kubenswrapper[4745]: W0319 00:08:07.284274 4745 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource 
"csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:07 crc kubenswrapper[4745]: E0319 00:08:07.284337 4745 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 00:08:08 crc kubenswrapper[4745]: I0319 00:08:08.085814 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.086799 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.137461 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.138838 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.138908 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.138931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:09 crc kubenswrapper[4745]: I0319 00:08:09.139694 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" Mar 19 00:08:09 crc kubenswrapper[4745]: E0319 00:08:09.139974 4745 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:10 crc kubenswrapper[4745]: I0319 00:08:10.085017 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.084932 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.278111 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.280276 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.280352 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.280365 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:11 crc kubenswrapper[4745]: I0319 00:08:11.280392 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:08:11 crc kubenswrapper[4745]: E0319 00:08:11.284833 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User 
\"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 00:08:11 crc kubenswrapper[4745]: E0319 00:08:11.286722 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 00:08:12 crc kubenswrapper[4745]: I0319 00:08:12.084092 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:13 crc kubenswrapper[4745]: I0319 00:08:13.086588 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:14 crc kubenswrapper[4745]: I0319 00:08:14.086112 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:15 crc kubenswrapper[4745]: I0319 00:08:15.085224 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:16 crc kubenswrapper[4745]: I0319 00:08:16.085796 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:16 crc kubenswrapper[4745]: E0319 
00:08:16.223777 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:08:17 crc kubenswrapper[4745]: I0319 00:08:17.085405 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.086786 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.285911 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.289614 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.289764 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.289856 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:18 crc kubenswrapper[4745]: I0319 00:08:18.290008 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:08:18 crc kubenswrapper[4745]: E0319 00:08:18.295222 4745 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 00:08:18 crc kubenswrapper[4745]: E0319 00:08:18.295774 4745 controller.go:145] "Failed to ensure lease exists, will retry" 
err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 00:08:19 crc kubenswrapper[4745]: I0319 00:08:19.085051 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:20 crc kubenswrapper[4745]: I0319 00:08:20.101968 4745 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 00:08:20 crc kubenswrapper[4745]: I0319 00:08:20.817995 4745 csr.go:261] certificate signing request csr-lh9q4 is approved, waiting to be issued Mar 19 00:08:20 crc kubenswrapper[4745]: I0319 00:08:20.829750 4745 csr.go:257] certificate signing request csr-lh9q4 is issued Mar 19 00:08:20 crc kubenswrapper[4745]: I0319 00:08:20.860560 4745 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 00:08:20 crc kubenswrapper[4745]: I0319 00:08:20.925282 4745 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 00:08:21 crc kubenswrapper[4745]: I0319 00:08:21.830990 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-24 01:51:09.454847496 +0000 UTC Mar 19 00:08:21 crc kubenswrapper[4745]: I0319 00:08:21.831091 4745 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6721h42m47.623762766s for next certificate rotation Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.137143 4745 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.138942 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.139020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.139039 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.140126 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.500748 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.502781 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837"} Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.502968 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.503854 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.503912 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:22 crc kubenswrapper[4745]: I0319 00:08:22.503928 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:23 crc 
kubenswrapper[4745]: I0319 00:08:23.509717 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.510693 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.514376 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" exitCode=255 Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.514442 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837"} Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.514501 4745 scope.go:117] "RemoveContainer" containerID="dce1182a8dd8ad8765f208ca1007c4e687632c9584acacbeaa44bd9ace9e60f9" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.514803 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.516646 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.516701 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.516724 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:23 crc kubenswrapper[4745]: I0319 00:08:23.517771 4745 scope.go:117] "RemoveContainer" 
containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:23 crc kubenswrapper[4745]: E0319 00:08:23.518091 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:24 crc kubenswrapper[4745]: I0319 00:08:24.519338 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.108460 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.108723 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.110327 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.110402 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.110424 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.111551 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.111988 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.295791 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.297392 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.297616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.297718 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.298028 4745 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.307855 4745 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.308389 4745 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.308436 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.311943 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.312019 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 
00:08:25.312040 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.312065 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.312082 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:25Z","lastTransitionTime":"2026-03-19T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.328024 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.335400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.335459 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.335482 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.335514 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.335533 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:25Z","lastTransitionTime":"2026-03-19T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.347457 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.357658 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.357740 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.357761 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.357792 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.357818 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:25Z","lastTransitionTime":"2026-03-19T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.370931 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.381124 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.381201 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.381229 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.381258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.381279 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:25Z","lastTransitionTime":"2026-03-19T00:08:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.393453 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.393599 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.393626 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.493961 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: I0319 00:08:25.507854 4745 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.594457 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.694803 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.795044 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.895320 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:25 crc kubenswrapper[4745]: E0319 00:08:25.995643 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.096192 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.196618 
4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.225028 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.297550 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.398045 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.498183 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.599336 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.699971 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.789226 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.789395 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.791070 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.791218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.791331 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 19 00:08:26 crc kubenswrapper[4745]: I0319 00:08:26.792813 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.793336 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.801006 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:26 crc kubenswrapper[4745]: E0319 00:08:26.902192 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.003311 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.103683 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.204374 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.305131 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.405919 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.506042 4745 kubelet_node_status.go:503] "Error getting the current node from 
lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.606435 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.708065 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.808572 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:27 crc kubenswrapper[4745]: E0319 00:08:27.909609 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.010420 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.111114 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.211909 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.312190 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.413212 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.514255 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.615045 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.716066 4745 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.816244 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:28 crc kubenswrapper[4745]: E0319 00:08:28.917026 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.017686 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.118916 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.219967 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.320361 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.420731 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.521757 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.622370 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.722489 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc kubenswrapper[4745]: E0319 00:08:29.822903 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:29 crc 
kubenswrapper[4745]: E0319 00:08:29.923845 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.024982 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.125796 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.226705 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.327832 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.428939 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.529595 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.629962 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.730308 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.830931 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:30 crc kubenswrapper[4745]: E0319 00:08:30.931552 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.032570 4745 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.133209 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.233457 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.334548 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.434951 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.535030 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.635830 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.736442 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.836908 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:31 crc kubenswrapper[4745]: E0319 00:08:31.937271 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.037475 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.138563 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.239665 4745 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.340125 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.441235 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.541533 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.642333 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.742976 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: I0319 00:08:32.796284 4745 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.843697 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:32 crc kubenswrapper[4745]: E0319 00:08:32.944709 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.045798 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.146576 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.247284 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 
00:08:33.348476 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.448868 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.549611 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.650461 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.750620 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.851550 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:33 crc kubenswrapper[4745]: E0319 00:08:33.952172 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.052803 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.153546 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.253997 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.354335 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.454470 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 
00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.555174 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.655560 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.756279 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.856866 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:34 crc kubenswrapper[4745]: E0319 00:08:34.957930 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.059080 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.160262 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.260732 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.361304 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.461987 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.562803 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.570927 4745 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="error getting node \"crc\": node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.576347 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.576376 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.576386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.576404 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.576417 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:35Z","lastTransitionTime":"2026-03-19T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.587654 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.590662 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.590740 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.590753 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.590768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.590783 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:35Z","lastTransitionTime":"2026-03-19T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.598965 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.602038 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.602082 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.602094 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.602116 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.602132 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:35Z","lastTransitionTime":"2026-03-19T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.610702 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.615229 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.615273 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.615283 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.615300 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:35 crc kubenswrapper[4745]: I0319 00:08:35.615644 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:35Z","lastTransitionTime":"2026-03-19T00:08:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.626960 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.627139 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.663457 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.763967 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.864484 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:35 crc kubenswrapper[4745]: E0319 00:08:35.965042 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.065425 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: I0319 00:08:36.137425 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 00:08:36 crc kubenswrapper[4745]: I0319 00:08:36.138828 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:36 crc kubenswrapper[4745]: I0319 00:08:36.138885 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:36 crc kubenswrapper[4745]: I0319 00:08:36.138937 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.166245 
4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.225291 4745 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.266813 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.367303 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.467592 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.568549 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.669470 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.770413 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.871300 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:36 crc kubenswrapper[4745]: E0319 00:08:36.971737 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.072688 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: I0319 00:08:37.137233 4745 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 
19 00:08:37 crc kubenswrapper[4745]: I0319 00:08:37.138403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:37 crc kubenswrapper[4745]: I0319 00:08:37.138457 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:37 crc kubenswrapper[4745]: I0319 00:08:37.138475 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:37 crc kubenswrapper[4745]: I0319 00:08:37.139151 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.139369 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.173322 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.273804 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.374928 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.475061 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.575670 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.676665 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.777500 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.878411 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:37 crc kubenswrapper[4745]: E0319 00:08:37.979208 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.080381 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.180855 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.281332 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.381927 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.482579 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.583019 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.683696 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.784723 4745 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.885108 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:38 crc kubenswrapper[4745]: E0319 00:08:38.985682 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.086453 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.186877 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.287630 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.387775 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.488652 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.589820 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.690188 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.790492 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.891644 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 00:08:39 crc kubenswrapper[4745]: E0319 00:08:39.992754 4745 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.093621 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.194066 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.295062 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.395810 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.495956 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.597001 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.698100 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.798851 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.899460 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:40 crc kubenswrapper[4745]: E0319 00:08:40.999627 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.100311 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.200455 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.300795 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.401970 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.502691 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.603414 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.704261 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.805331 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:41 crc kubenswrapper[4745]: E0319 00:08:41.906506 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.007218 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.107721 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.208290 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.308648 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.409804 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.510350 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.611192 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.711460 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.812270 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:42 crc kubenswrapper[4745]: E0319 00:08:42.912643 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.012769 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.113206 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.214055 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.314877 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.415602 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.516661 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.617748 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.717945 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.818766 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:43 crc kubenswrapper[4745]: E0319 00:08:43.919199 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.019796 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.119989 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.220542 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.320908 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.421704 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.522248 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:44 crc kubenswrapper[4745]: E0319 00:08:44.623207 4745 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319
00:08:44.676364 4745 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.727405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.727450 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.727466 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.727490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.727506 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:44Z","lastTransitionTime":"2026-03-19T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.831450 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.831543 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.831562 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.831624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.831646 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:44Z","lastTransitionTime":"2026-03-19T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.935225 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.935305 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.935325 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.935357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:44 crc kubenswrapper[4745]: I0319 00:08:44.935379 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:44Z","lastTransitionTime":"2026-03-19T00:08:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.038552 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.038594 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.038608 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.038627 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.038642 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.112124 4745 apiserver.go:52] "Watching apiserver"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.119080 4745 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.119555 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-qt5t5","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-ovn-kubernetes/ovnkube-node-w2988","openshift-dns/node-resolver-5xqfc","openshift-multus/network-metrics-daemon-4r5k5","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-node-identity/network-node-identity-vrzqb","openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j","openshift-image-registry/node-ca-xjkg8","openshift-multus/multus-additional-cni-plugins-n8tr6","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-multus/multus-mlwp7","openshift-network-diagnostics/network-check-target-xd92c"]
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.119983 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120215 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.120417 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120479 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120515 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120550 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120607 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120619 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5xqfc"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.120717 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.127595 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.127775 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.127948 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.128098 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.128425 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.129316 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.130403 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.130708 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.130870 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xjkg8"
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.132042 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.133584 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.133996 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.134024 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n8tr6"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.134344 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.134639 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.141657 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.142358 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.142649 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143167 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143276 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143398 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143515 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143614 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143731 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143829 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.143972 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144075 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144225 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144390 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144502 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144597 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144698 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.144818 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.145118 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.145255 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.145378 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.145493 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.145635 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.146484 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.146685 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.147078 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.147478 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.147472 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.147530 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.147535 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.149240 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.150063 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.150093 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.150103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.150161 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.150171 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.162403 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.171426 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.184959 4745 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.185660 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.200923 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.212282 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.214813 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.214907 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.214943 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.214978 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215014 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215052 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215086 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.215121 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215153 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215187 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215221 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215253 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215284 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215317 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215349 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215385 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215421 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215644 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 
00:08:45.215676 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215710 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215772 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215935 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.215994 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216052 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216102 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216136 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216152 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216199 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216244 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.216288 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216318 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216334 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216399 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216426 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216446 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216463 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216512 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216532 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216560 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216584 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216607 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216625 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216720 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216741 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216759 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216761 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216781 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216802 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216819 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216856 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216938 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216961 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216979 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.216997 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217017 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217033 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217047 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217067 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217083 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217100 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.217140 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217157 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217178 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217161 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217202 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217224 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217248 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217263 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217283 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217300 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217315 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217333 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217317 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217349 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217410 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217608 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217634 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217644 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217706 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.217990 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218108 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218182 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218559 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218576 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218605 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.218995 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.219291 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.219865 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.219993 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220083 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220124 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220348 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220404 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220659 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220733 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.220865 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221114 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221119 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221226 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221262 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221276 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221334 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221391 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221417 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221442 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221488 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221535 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221597 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221649 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221699 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221712 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221746 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221751 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221798 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221908 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221958 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.221993 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222037 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222141 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222153 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222182 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222225 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222289 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222326 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222361 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222338 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222391 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222399 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222527 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222567 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222639 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222700 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.222769 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222805 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222809 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222839 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222843 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222902 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222909 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222963 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.222998 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223027 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.223112 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223146 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223177 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223207 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223237 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223270 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223424 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223460 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223531 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223564 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223595 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.223631 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223663 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223767 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223805 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223836 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223861 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223904 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223928 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223951 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223973 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.223991 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224012 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224031 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224051 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224078 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224098 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") 
pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224136 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224152 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224171 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224189 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224214 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224233 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224253 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224273 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224295 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224313 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224339 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod 
\"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224364 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224393 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224417 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224445 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224469 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: 
I0319 00:08:45.224498 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224525 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224552 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224573 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224592 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224612 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224632 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224651 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224680 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224703 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224725 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224744 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224762 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224782 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224803 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224821 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224840 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: 
\"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224861 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224905 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224933 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224956 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224975 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.224994 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225012 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225032 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225050 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225144 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225171 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225199 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.225246 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.725212724 +0000 UTC m=+90.263407865 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225294 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225331 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225358 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225384 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225410 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225434 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225466 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225492 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225516 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225542 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.225567 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225592 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225618 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225644 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225687 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225726 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225763 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225788 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225819 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225851 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.225919 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226036 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226076 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09cb2800-ce49-44cf-89b5-d1e5459299c5-serviceca\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226110 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-kube-api-access-j7kx7\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226138 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/400972f4-050f-4f26-b982-ced6f2590c8b-proxy-tls\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226159 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-cni-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226181 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cni-binary-copy\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226209 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226239 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226290 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226320 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.226342 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226371 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226402 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-multus-certs\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226428 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-etc-kubernetes\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226460 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwglz\" (UniqueName: \"kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226488 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226515 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-kubelet\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226537 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226560 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226581 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config\") pod 
\"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226605 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-os-release\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226632 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226655 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-k8s-cni-cncf-io\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226676 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226697 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226723 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226746 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09cb2800-ce49-44cf-89b5-d1e5459299c5-host\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226769 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226790 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226812 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226839 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226863 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226911 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbtfl\" (UniqueName: \"kubernetes.io/projected/09cb2800-ce49-44cf-89b5-d1e5459299c5-kube-api-access-gbtfl\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226940 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226963 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-socket-dir-parent\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.226986 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-daemon-config\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227009 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227033 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227055 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-conf-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227077 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p582k\" (UniqueName: \"kubernetes.io/projected/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-kube-api-access-p582k\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227103 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227128 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwl4z\" (UniqueName: \"kubernetes.io/projected/9ddaa87b-caf7-46de-b693-96c60909d05e-kube-api-access-qwl4z\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227149 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-netns\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227172 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.227209 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48rc4\" (UniqueName: \"kubernetes.io/projected/33ce3f8d-5035-4139-b206-f3c36e53618c-kube-api-access-48rc4\") pod \"node-resolver-5xqfc\" (UID: \"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227240 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-system-cni-dir\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227261 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-os-release\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227284 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-bin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227304 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-system-cni-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227366 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227398 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227432 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227470 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " 
pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227502 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227533 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/400972f4-050f-4f26-b982-ced6f2590c8b-rootfs\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227562 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-multus\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227593 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227621 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-cnibin\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227647 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95ftv\" (UniqueName: \"kubernetes.io/projected/c4250eab-9d3c-457f-9a78-50400c5f65f3-kube-api-access-95ftv\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227669 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxx4n\" (UniqueName: \"kubernetes.io/projected/400972f4-050f-4f26-b982-ced6f2590c8b-kube-api-access-jxx4n\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227706 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-hostroot\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227728 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227752 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33ce3f8d-5035-4139-b206-f3c36e53618c-hosts-file\") pod \"node-resolver-5xqfc\" (UID: \"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227776 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227803 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227829 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227855 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 
00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227941 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/400972f4-050f-4f26-b982-ced6f2590c8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.227987 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cnibin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228022 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228054 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228081 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: 
\"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228175 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228192 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228209 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228225 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228239 4745 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228259 4745 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228273 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 
00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228287 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228301 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228315 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228328 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228342 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228355 4745 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228370 4745 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228384 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228398 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228413 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228428 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228442 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228456 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228470 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228486 4745 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228501 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228516 4745 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228529 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228543 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228557 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228573 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228589 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" 
DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228617 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228644 4745 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228668 4745 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228689 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228708 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228726 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228749 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228766 4745 
reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228785 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.228980 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229000 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229089 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229109 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229128 4745 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229147 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229120 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229755 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229795 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.229858 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.230090 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.230132 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.230361 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.231353 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.231632 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.231944 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.232576 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.233016 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.234389 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.234596 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.234687 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.234820 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.235064 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.235248 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.235261 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.235592 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.235570 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236116 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236165 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236480 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236571 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236860 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.236871 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237100 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237101 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237147 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237150 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237375 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237485 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237523 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.237991 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238145 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238172 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238334 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238524 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238545 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.238637 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.239016 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.239029 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.239035 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.239180 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.239862 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240044 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.242063 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240062 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240199 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240306 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240311 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.240381 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.241518 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.241951 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.242024 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.242303 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.242448 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.242775 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.243402 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.243459 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.243993 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.244013 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.244397 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.244715 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.244421 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.244789 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.245112 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.245354 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.245436 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.245598 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.245896 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.246050 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.246105 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.246700 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.247471 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248061 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248177 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248181 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248243 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248267 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248722 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.248817 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.249023 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.249072 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.251602 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.252022 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.252530 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.252678 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.252957 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.252235 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.253348 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.253416 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.253406 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.253520 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.253771 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.253874 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.253936 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.753906977 +0000 UTC m=+90.292102118 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.254515 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.254743 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.254946 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.255202 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.255658 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.255760 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.256187 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.256484 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.256791 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.257061 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.257255 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.757211114 +0000 UTC m=+90.295406285 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.257626 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.257857 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.258214 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.258303 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.259617 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.259849 4745 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.259862 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.260312 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.260696 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.261574 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.261966 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.262342 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.262438 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.262485 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.262806 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.262866 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.263013 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.261271 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.265369 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.273202 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.276946 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.261828 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.271299 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.274875 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277827 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277877 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277906 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.272119 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.267433 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.268159 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.272175 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.272777 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.273030 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.274024 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.274130 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.274587 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.275093 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.275125 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.278541 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.278566 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.275548 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.260856 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277182 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.277480 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.278711 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.778650463 +0000 UTC m=+90.316845594 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.280556 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.280593 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.280609 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.280680 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.780658157 +0000 UTC m=+90.318853298 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.281007 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.282483 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.282600 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.282678 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod 
"7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.283120 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.285455 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.292648 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.293167 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.293377 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.293477 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.294512 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.294637 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.294474 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.294994 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.295015 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.295897 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.306093 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.307606 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.311217 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.318949 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.319747 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329105 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329581 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-system-cni-dir\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329619 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-os-release\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329641 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwl4z\" (UniqueName: \"kubernetes.io/projected/9ddaa87b-caf7-46de-b693-96c60909d05e-kube-api-access-qwl4z\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329737 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-os-release\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329735 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-system-cni-dir\") pod 
\"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329777 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329800 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-netns\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329821 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329841 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48rc4\" (UniqueName: \"kubernetes.io/projected/33ce3f8d-5035-4139-b206-f3c36e53618c-kube-api-access-48rc4\") pod \"node-resolver-5xqfc\" (UID: \"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329860 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-system-cni-dir\") pod 
\"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.329857 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-netns\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.330032 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-system-cni-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.330706 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332067 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-bin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332103 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc 
kubenswrapper[4745]: I0319 00:08:45.332147 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332167 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332189 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332210 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-cnibin\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332228 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " 
pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332284 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/400972f4-050f-4f26-b982-ced6f2590c8b-rootfs\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332304 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-multus\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332326 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332345 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95ftv\" (UniqueName: \"kubernetes.io/projected/c4250eab-9d3c-457f-9a78-50400c5f65f3-kube-api-access-95ftv\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332372 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxx4n\" (UniqueName: \"kubernetes.io/projected/400972f4-050f-4f26-b982-ced6f2590c8b-kube-api-access-jxx4n\") pod 
\"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332390 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-hostroot\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332406 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332424 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33ce3f8d-5035-4139-b206-f3c36e53618c-hosts-file\") pod \"node-resolver-5xqfc\" (UID: \"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332457 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332477 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") 
" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332494 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/400972f4-050f-4f26-b982-ced6f2590c8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332512 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cnibin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332530 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332546 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332564 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: 
\"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332600 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09cb2800-ce49-44cf-89b5-d1e5459299c5-serviceca\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332618 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-kube-api-access-j7kx7\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332642 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/400972f4-050f-4f26-b982-ced6f2590c8b-proxy-tls\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332659 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-cni-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332685 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cni-binary-copy\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 
19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332705 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332736 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-multus-certs\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332754 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332770 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332787 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332821 
4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332839 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-kubelet\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332856 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-etc-kubernetes\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332872 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwglz\" (UniqueName: \"kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332909 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-os-release\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332932 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332951 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332967 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.332997 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09cb2800-ce49-44cf-89b5-d1e5459299c5-host\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333028 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-k8s-cni-cncf-io\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333043 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333044 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-bin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333069 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333565 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333608 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/33ce3f8d-5035-4139-b206-f3c36e53618c-hosts-file\") pod \"node-resolver-5xqfc\" (UID: \"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333628 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333635 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333645 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-kubelet\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333687 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333706 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333709 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333750 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333784 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbtfl\" (UniqueName: \"kubernetes.io/projected/09cb2800-ce49-44cf-89b5-d1e5459299c5-kube-api-access-gbtfl\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333819 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-socket-dir-parent\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333838 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-daemon-config\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333856 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.333863 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333873 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333934 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333963 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-k8s-cni-cncf-io\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333966 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-binary-copy\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333994 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.334002 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:45.833907999 +0000 UTC m=+90.372103120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334036 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-conf-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334059 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p582k\" (UniqueName: \"kubernetes.io/projected/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-kube-api-access-p582k\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334279 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334385 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.333690 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-cni-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334536 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-os-release\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334871 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.334947 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-conf-dir\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335096 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335246 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/400972f4-050f-4f26-b982-ced6f2590c8b-rootfs\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335352 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335377 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335475 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/400972f4-050f-4f26-b982-ced6f2590c8b-mcd-auth-proxy-config\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335524 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cnibin\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335550 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335553 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-env-overrides\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335588 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335591 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-hostroot\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335607 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335630 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335583 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9ddaa87b-caf7-46de-b693-96c60909d05e-cnibin\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335926 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-run-multus-certs\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335950 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.335968 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.336026 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-etc-kubernetes\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.336062 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.336087 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-host-var-lib-cni-multus\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.336613 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/09cb2800-ce49-44cf-89b5-d1e5459299c5-serviceca\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.336631 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/09cb2800-ce49-44cf-89b5-d1e5459299c5-host\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337253 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-daemon-config\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337289 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337449 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337465 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337476 4745 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337486 4745 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337498 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337509 4745 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337521 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337533 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337547 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337558 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337569 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337583 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337597 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337613 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337626 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337640 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337655 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337670 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337697 4745 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337716 4745 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337730 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337744 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337755 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337765 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337776 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337787 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337798 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337807 4745 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337819 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337828 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337839 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337850 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337861 4745 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337870 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337882 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.337907 4745 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340066 4745 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340082 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340468 4745 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340480 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340491 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340082 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-multus-socket-dir-parent\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340543 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340556 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340566 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340576 4745 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.339923 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/400972f4-050f-4f26-b982-ced6f2590c8b-proxy-tls\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340585 4745 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340632 4745 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340643 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340654 4745 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340665 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340676 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340687 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340700 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340709 4745 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340720 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340720 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9ddaa87b-caf7-46de-b693-96c60909d05e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-n8tr6\" (UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6"
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340731 4745 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340803 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340821 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340841 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340860 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340926 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340942 4745 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340956 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340972 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340985 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.340999 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node
\"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341013 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341027 4745 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341044 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341057 4745 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341070 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341084 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341100 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341114 4745 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341129 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341145 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341159 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341174 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341187 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341200 4745 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341214 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath 
\"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341232 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341248 4745 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341264 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341280 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341295 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341310 4745 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341331 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341347 4745 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341362 4745 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341379 4745 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341398 4745 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341413 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341429 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341447 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341461 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341477 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341494 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341517 4745 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341541 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341559 4745 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341574 4745 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341588 4745 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 
00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341603 4745 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341619 4745 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341634 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341649 4745 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341665 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341680 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341696 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341711 4745 reconciler_common.go:293] 
"Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341728 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341741 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341754 4745 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341772 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341786 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341800 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341813 4745 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" 
DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341827 4745 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341843 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341857 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341872 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341909 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341924 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341937 4745 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341951 4745 
reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341967 4745 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341984 4745 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.341997 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342012 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342029 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342049 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342070 4745 reconciler_common.go:293] "Volume detached for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342092 4745 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342110 4745 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342127 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342146 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342168 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342188 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342205 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342223 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342241 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342260 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342278 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342298 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342314 4745 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342328 4745 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342340 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342355 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.342372 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.343196 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-cni-binary-copy\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.346008 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.348627 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.348697 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.352242 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c4250eab-9d3c-457f-9a78-50400c5f65f3-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.352508 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48rc4\" (UniqueName: \"kubernetes.io/projected/33ce3f8d-5035-4139-b206-f3c36e53618c-kube-api-access-48rc4\") pod \"node-resolver-5xqfc\" (UID: 
\"33ce3f8d-5035-4139-b206-f3c36e53618c\") " pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.352628 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p582k\" (UniqueName: \"kubernetes.io/projected/6a0ae9c0-f19a-4038-be03-0fa6d223ebbf-kube-api-access-p582k\") pod \"multus-mlwp7\" (UID: \"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\") " pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.353272 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxx4n\" (UniqueName: \"kubernetes.io/projected/400972f4-050f-4f26-b982-ced6f2590c8b-kube-api-access-jxx4n\") pod \"machine-config-daemon-qt5t5\" (UID: \"400972f4-050f-4f26-b982-ced6f2590c8b\") " pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.353363 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95ftv\" (UniqueName: \"kubernetes.io/projected/c4250eab-9d3c-457f-9a78-50400c5f65f3-kube-api-access-95ftv\") pod \"ovnkube-control-plane-749d76644c-qvl5j\" (UID: \"c4250eab-9d3c-457f-9a78-50400c5f65f3\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.354599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwglz\" (UniqueName: \"kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz\") pod \"ovnkube-node-w2988\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") " pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.354703 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwl4z\" (UniqueName: \"kubernetes.io/projected/9ddaa87b-caf7-46de-b693-96c60909d05e-kube-api-access-qwl4z\") pod \"multus-additional-cni-plugins-n8tr6\" 
(UID: \"9ddaa87b-caf7-46de-b693-96c60909d05e\") " pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.361081 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.361264 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbtfl\" (UniqueName: \"kubernetes.io/projected/09cb2800-ce49-44cf-89b5-d1e5459299c5-kube-api-access-gbtfl\") pod \"node-ca-xjkg8\" (UID: \"09cb2800-ce49-44cf-89b5-d1e5459299c5\") " pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.365325 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7kx7\" (UniqueName: \"kubernetes.io/projected/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-kube-api-access-j7kx7\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.372780 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.381454 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.381497 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.381509 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.381528 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.381539 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.483940 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.483987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.483999 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.484020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.484033 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.488536 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.506459 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.513433 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-55ea0674e3b76510e05bafdccd218a77a0ea7c853b2c0c75f5d72a466c8c2f3e WatchSource:0}: Error finding container 55ea0674e3b76510e05bafdccd218a77a0ea7c853b2c0c75f5d72a466c8c2f3e: Status 404 returned error can't find the container with id 55ea0674e3b76510e05bafdccd218a77a0ea7c853b2c0c75f5d72a466c8c2f3e Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.516744 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.523324 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-74b0ab61cf5fe675ad73845b389f449bbbb658343ca438f2414f49c2ba26dd44 WatchSource:0}: Error finding container 74b0ab61cf5fe675ad73845b389f449bbbb658343ca438f2414f49c2ba26dd44: Status 404 returned error can't find the container with id 74b0ab61cf5fe675ad73845b389f449bbbb658343ca438f2414f49c2ba26dd44 Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.524549 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.544305 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-f14165b3d1142a91332943932f4b2d00b2dede8c068833322009828e9146351d WatchSource:0}: Error finding container f14165b3d1142a91332943932f4b2d00b2dede8c068833322009828e9146351d: Status 404 returned error can't find the container with id f14165b3d1142a91332943932f4b2d00b2dede8c068833322009828e9146351d Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.548324 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-5xqfc" Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.558453 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc4250eab_9d3c_457f_9a78_50400c5f65f3.slice/crio-33d82be740bc346a2ac510dadefc479fc4b2de8995038ba6d03c4e33bb0019bd WatchSource:0}: Error finding container 33d82be740bc346a2ac510dadefc479fc4b2de8995038ba6d03c4e33bb0019bd: Status 404 returned error can't find the container with id 33d82be740bc346a2ac510dadefc479fc4b2de8995038ba6d03c4e33bb0019bd Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.570248 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33ce3f8d_5035_4139_b206_f3c36e53618c.slice/crio-4a57c74bb0019820841aa742d37b2aa0a09fefb850a3631365af5e1d25baf876 WatchSource:0}: Error finding container 4a57c74bb0019820841aa742d37b2aa0a09fefb850a3631365af5e1d25baf876: Status 404 returned error can't find the container with id 4a57c74bb0019820841aa742d37b2aa0a09fefb850a3631365af5e1d25baf876 Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.576876 4745 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns/node-resolver-5xqfc" event={"ID":"33ce3f8d-5035-4139-b206-f3c36e53618c","Type":"ContainerStarted","Data":"4a57c74bb0019820841aa742d37b2aa0a09fefb850a3631365af5e1d25baf876"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.577825 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" event={"ID":"c4250eab-9d3c-457f-9a78-50400c5f65f3","Type":"ContainerStarted","Data":"33d82be740bc346a2ac510dadefc479fc4b2de8995038ba6d03c4e33bb0019bd"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.578693 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f14165b3d1142a91332943932f4b2d00b2dede8c068833322009828e9146351d"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.579291 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xjkg8" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.580384 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"74b0ab61cf5fe675ad73845b389f449bbbb658343ca438f2414f49c2ba26dd44"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.587782 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"55ea0674e3b76510e05bafdccd218a77a0ea7c853b2c0c75f5d72a466c8c2f3e"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.587997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.588085 4745 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.588119 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.588159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.588187 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.604819 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.612195 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.620308 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mlwp7" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.626259 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.653328 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ddaa87b_caf7_46de_b693_96c60909d05e.slice/crio-b364a57f24f84cb381aa5f730ab98cd9e7f0feb70fc73842d17c5d170af415fe WatchSource:0}: Error finding container b364a57f24f84cb381aa5f730ab98cd9e7f0feb70fc73842d17c5d170af415fe: Status 404 returned error can't find the container with id b364a57f24f84cb381aa5f730ab98cd9e7f0feb70fc73842d17c5d170af415fe Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.663345 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400972f4_050f_4f26_b982_ced6f2590c8b.slice/crio-232dec797e251d88e57638877e07098e41d7808375648a38c5a284c7b37f1066 WatchSource:0}: Error finding container 232dec797e251d88e57638877e07098e41d7808375648a38c5a284c7b37f1066: Status 404 returned error can't find the container with id 232dec797e251d88e57638877e07098e41d7808375648a38c5a284c7b37f1066 Mar 19 00:08:45 crc kubenswrapper[4745]: W0319 00:08:45.682678 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a0ae9c0_f19a_4038_be03_0fa6d223ebbf.slice/crio-e0f38711796ea26c0b48e62abf0504374426cbbf4e9f3e84e32b0853dcb4d933 WatchSource:0}: Error finding container e0f38711796ea26c0b48e62abf0504374426cbbf4e9f3e84e32b0853dcb4d933: Status 404 returned error can't find the container with id e0f38711796ea26c0b48e62abf0504374426cbbf4e9f3e84e32b0853dcb4d933 Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.691075 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.691123 4745 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.691136 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.691158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.691177 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.719433 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.719488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.719501 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.719523 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.719542 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.730629 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.735738 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.735774 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.735787 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.735806 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.735822 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.751232 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.751583 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.751556768 +0000 UTC m=+91.289751899 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.762523 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.777718 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.777778 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.777796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.777820 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.777836 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.788013 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.793638 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.793676 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.793690 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.793706 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.793719 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.805088 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.810424 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.810478 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.810490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.810513 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.810534 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.819939 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.820053 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.822129 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.822168 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.822186 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.822207 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.822222 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.852389 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.852643 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.852694 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.852727 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.852754 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.852907 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.852969 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.852949998 +0000 UTC m=+91.391145129 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853057 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853084 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853097 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 
00:08:45.853130 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.853121544 +0000 UTC m=+91.391316675 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853178 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853209 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.853199697 +0000 UTC m=+91.391394838 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853256 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853285 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.853277009 +0000 UTC m=+91.391472140 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853345 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853359 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853371 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: E0319 00:08:45.853400 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:46.853391153 +0000 UTC m=+91.391586284 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.924325 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.924363 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.924372 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.924390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:45 crc kubenswrapper[4745]: I0319 00:08:45.924401 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:45Z","lastTransitionTime":"2026-03-19T00:08:45Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.026709 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.026980 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.027062 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.027137 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.027202 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.129472 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.129752 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.129852 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.130092 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.130349 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.141455 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.142710 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.143936 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.144630 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.145729 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.146396 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.147131 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.148244 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.148938 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.149935 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.150543 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.151663 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.152586 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.152771 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.153289 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.154369 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.156180 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.157227 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.157689 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.158563 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.159658 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.160252 4745 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.161787 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.162341 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.162514 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.163594 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.164321 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 19 00:08:46 crc 
kubenswrapper[4745]: I0319 00:08:46.165323 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.166608 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.167201 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.168368 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.169115 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.170116 4745 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.170322 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.172510 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" 
path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.173367 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.173973 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.174579 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.176206 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.183095 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.184198 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.186452 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.187437 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.188773 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.189792 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.191358 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.191769 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.192642 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.193845 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.194655 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.196138 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.197132 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.198740 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.199340 4745 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.199774 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.201216 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.202153 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.202797 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.203816 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.209788 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.218393 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.227942 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.239008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.239074 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.239095 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.239124 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.239144 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.245369 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.256902 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.265474 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.279847 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.293643 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.310909 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.342337 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.342390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.342399 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.342418 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.342430 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.445448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.445688 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.445700 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.445726 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.445739 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.548629 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.548669 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.548680 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.548698 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.548711 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.593739 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.595018 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" exitCode=0 Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.595094 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.595283 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"71f66e30efa5016ee954a4cb19c576186a237cdc85750ce0837af353f57d6b56"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.596408 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771" exitCode=0 Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.596474 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.596641 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerStarted","Data":"b364a57f24f84cb381aa5f730ab98cd9e7f0feb70fc73842d17c5d170af415fe"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.601945 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" event={"ID":"c4250eab-9d3c-457f-9a78-50400c5f65f3","Type":"ContainerStarted","Data":"dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.603003 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" event={"ID":"c4250eab-9d3c-457f-9a78-50400c5f65f3","Type":"ContainerStarted","Data":"bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.603727 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerStarted","Data":"7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.603779 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerStarted","Data":"e0f38711796ea26c0b48e62abf0504374426cbbf4e9f3e84e32b0853dcb4d933"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.607567 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.607612 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.607630 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"232dec797e251d88e57638877e07098e41d7808375648a38c5a284c7b37f1066"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.609505 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xjkg8" event={"ID":"09cb2800-ce49-44cf-89b5-d1e5459299c5","Type":"ContainerStarted","Data":"a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.609559 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xjkg8" event={"ID":"09cb2800-ce49-44cf-89b5-d1e5459299c5","Type":"ContainerStarted","Data":"46c60a430906056106f7c91de97ac2cc34eae6186a8498f8e398e75fe6cd086e"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.609645 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.611190 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-5xqfc" event={"ID":"33ce3f8d-5035-4139-b206-f3c36e53618c","Type":"ContainerStarted","Data":"11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.614465 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.614537 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.631439 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.646376 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.652047 4745 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.652118 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.652140 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.652163 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.652177 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.658537 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.668163 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.678164 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.691760 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.707004 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.722165 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.737985 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755612 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755631 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755757 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755773 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.755812 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.760647 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.761978 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 00:08:48.761953216 +0000 UTC m=+93.300148337 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.769673 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.784173 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.804410 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.821990 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.840529 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.851839 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc 
kubenswrapper[4745]: I0319 00:08:46.859182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.859250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.859267 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.859290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.859311 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.862817 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.862859 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.862906 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.862929 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.862950 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") 
pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863033 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863070 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863093 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:48.863065207 +0000 UTC m=+93.401260338 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863156 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:48.863136979 +0000 UTC m=+93.401332110 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863159 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863185 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863191 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863199 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863215 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863245 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863245 4745 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:48.863237882 +0000 UTC m=+93.401433013 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863215 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863390 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:48.863337666 +0000 UTC m=+93.401532807 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:46 crc kubenswrapper[4745]: E0319 00:08:46.863419 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:48.863405908 +0000 UTC m=+93.401601129 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.870655 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.885938 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.899036 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.917384 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.936404 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.954286 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.961767 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:46 crc 
kubenswrapper[4745]: I0319 00:08:46.961806 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.961819 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.961840 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.961853 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:46Z","lastTransitionTime":"2026-03-19T00:08:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.970150 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:46 crc kubenswrapper[4745]: I0319 00:08:46.986083 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.009194 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.025248 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.053254 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.064855 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.064922 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.064935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.064958 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.064985 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.137161 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.137233 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.137288 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.137324 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:47 crc kubenswrapper[4745]: E0319 00:08:47.137363 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:47 crc kubenswrapper[4745]: E0319 00:08:47.137532 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:47 crc kubenswrapper[4745]: E0319 00:08:47.137653 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:47 crc kubenswrapper[4745]: E0319 00:08:47.137773 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.167369 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.167418 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.167431 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.167452 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.167466 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.270396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.270459 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.270473 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.270495 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.270511 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.373222 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.373764 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.373782 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.373809 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.373826 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.476091 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.476138 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.476151 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.476173 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.476184 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.579664 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.579881 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.579988 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.580064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.580125 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622297 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622344 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622356 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622366 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622376 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.622385 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} Mar 19 00:08:47 crc kubenswrapper[4745]: 
I0319 00:08:47.624808 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9" exitCode=0 Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.625112 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.641468 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.661818 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.679974 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.684473 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.684518 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.684530 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.684555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.684566 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.695056 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.709701 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.723599 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\
":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.740001 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc
8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.754724 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.764278 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.787829 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.787911 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.787925 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.787944 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.787956 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.793795 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.806957 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.818140 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc 
kubenswrapper[4745]: I0319 00:08:47.832420 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.847163 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.891171 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.891257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.891270 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.891292 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.891305 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.994927 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.994975 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.994984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.995000 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:47 crc kubenswrapper[4745]: I0319 00:08:47.995012 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:47Z","lastTransitionTime":"2026-03-19T00:08:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.098683 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.098725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.098739 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.098759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.098827 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.204077 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.204115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.204129 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.204145 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.204157 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.307562 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.307600 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.307611 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.307632 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.307642 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.410034 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.410078 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.410088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.410104 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.410116 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.520384 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.520440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.520455 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.520489 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.520501 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.624374 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.624780 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.624798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.624828 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.624847 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.630964 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc" exitCode=0 Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.631054 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.662769 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.678342 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.695713 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.713573 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.727832 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.727874 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.727915 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.727935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.727953 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.730403 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 
00:08:48.749707 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 
00:08:48.762600 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.776493 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.782242 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.782969 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.782944137 +0000 UTC m=+97.321139268 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.787911 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42
ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.797708 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountP
ath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.821049 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.830914 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.830950 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.830962 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.830986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.831002 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.836827 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.848970 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.859705 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:48 crc 
kubenswrapper[4745]: I0319 00:08:48.883084 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.883131 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.883155 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.883175 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.883194 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883305 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883343 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.883331086 +0000 UTC m=+97.421526217 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883401 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883422 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.883416378 +0000 UTC m=+97.421611499 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883480 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883492 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883504 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883525 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.883519842 +0000 UTC m=+97.421714973 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883560 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883580 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.883573642 +0000 UTC m=+97.421768773 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883619 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883628 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883637 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:48 crc kubenswrapper[4745]: E0319 00:08:48.883655 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:08:52.883649605 +0000 UTC m=+97.421844736 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.934048 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.934085 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.934093 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.934110 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:48 crc kubenswrapper[4745]: I0319 00:08:48.934122 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:48Z","lastTransitionTime":"2026-03-19T00:08:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.037656 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.037697 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.037708 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.037723 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.037734 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.137681 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.137972 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:49 crc kubenswrapper[4745]: E0319 00:08:49.138044 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:49 crc kubenswrapper[4745]: E0319 00:08:49.137963 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.138159 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.138281 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:49 crc kubenswrapper[4745]: E0319 00:08:49.138393 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:49 crc kubenswrapper[4745]: E0319 00:08:49.138496 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.145694 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.145739 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.145758 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.145791 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.145812 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.249252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.249298 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.249313 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.249336 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.249352 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.352258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.352310 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.352321 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.352344 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.352358 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.455722 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.455795 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.455823 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.455860 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.455919 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.558509 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.558555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.558565 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.558585 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.558595 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.635676 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.639757 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.643332 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d" exitCode=0 Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.643395 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.658063 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.661208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.661255 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.661268 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.661289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.661307 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.674955 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.691931 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc 
kubenswrapper[4745]: I0319 00:08:49.715624 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.728526 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.742015 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.754953 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.764489 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.764572 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.764592 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.764623 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.764640 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.783052 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 
00:08:49.801113 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 
00:08:49.814255 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.829709 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.844299 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.859122 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.869103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.869152 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.869166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.869191 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.869207 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.904354 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.918907 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.929124 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.940651 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.952731 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.976291 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\
\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:49Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.983405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.983436 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.983445 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.983459 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:49 crc kubenswrapper[4745]: I0319 00:08:49.983471 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:49Z","lastTransitionTime":"2026-03-19T00:08:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.003719 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"na
me\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.042283 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"
name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.065496 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.082044 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers 
with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.086703 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.086769 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.086780 
4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.086797 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.086807 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.096162 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.109578 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.123734 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.139606 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.150276 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.150516 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:08:50 crc kubenswrapper[4745]: E0319 00:08:50.150750 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.153786 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc 
kubenswrapper[4745]: I0319 00:08:50.190079 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.190114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.190123 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.190138 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.190148 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.292553 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.292603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.292613 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.292635 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.292646 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.395914 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.395971 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.395986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.396008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.396022 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.499353 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.499420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.499439 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.499470 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.499489 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.602517 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.602562 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.602571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.602587 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.602619 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.653095 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a" exitCode=0
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.653163 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a"}
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.653702 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837"
Mar 19 00:08:50 crc kubenswrapper[4745]: E0319 00:08:50.653862 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.667651 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.679955 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.706205 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.709974 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.710044 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.710085 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.710106 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.710119 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.722750 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.736804 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.757225 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.771581 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.786168 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc 
kubenswrapper[4745]: I0319 00:08:50.803234 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.815651 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.815698 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.815712 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.815735 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.815751 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.823991 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.838862 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99
cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.857048 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.872658 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.887364 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.902674 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:50Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.919357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.919399 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.919414 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:50 crc kubenswrapper[4745]: I0319 00:08:50.919434 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:50 crc 
kubenswrapper[4745]: I0319 00:08:50.919447 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:50Z","lastTransitionTime":"2026-03-19T00:08:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.021862 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.021917 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.021926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.021942 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.021953 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.125582 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.125648 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.125663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.125689 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.125705 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.136969 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.136991 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.136969 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:51 crc kubenswrapper[4745]: E0319 00:08:51.137112 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.136985 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:51 crc kubenswrapper[4745]: E0319 00:08:51.137349 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:51 crc kubenswrapper[4745]: E0319 00:08:51.137336 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:51 crc kubenswrapper[4745]: E0319 00:08:51.137447 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.229494 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.229578 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.229604 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.229644 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.229673 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.333182 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.333246 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.333258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.333276 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.333288 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.437040 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.437095 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.437107 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.437128 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.437141 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.539430 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.539513 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.539541 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.539577 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.539604 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.642543 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.643029 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.643046 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.643064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.643077 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.662864 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ddaa87b-caf7-46de-b693-96c60909d05e" containerID="e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41" exitCode=0 Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.662987 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerDied","Data":"e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.677037 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.677073 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.706764 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.717674 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.722045 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.736456 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.745106 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.745135 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.745147 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.745165 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.745177 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.761272 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.779238 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.797219 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.813384 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc 
kubenswrapper[4745]: I0319 00:08:51.828418 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.841553 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.848796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.848841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.848857 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.848901 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.848920 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.854949 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.867136 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.882321 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.897298 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.912469 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e695
07f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
3-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.927847 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.945324 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.951705 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.951758 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.951773 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.951792 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.951803 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:51Z","lastTransitionTime":"2026-03-19T00:08:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.958135 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.969846 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:51 crc kubenswrapper[4745]: I0319 00:08:51.984141 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:51Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.003291 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.017339 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.029870 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.054480 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.058434 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.058499 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.058514 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.058540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.058555 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.067232 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.078803 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc 
kubenswrapper[4745]: I0319 00:08:52.091060 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.100078 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.109574 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99
cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.120786 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.157296 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.161775 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.161813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.161825 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.161839 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.161855 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.264936 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.264976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.264986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.264999 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.265009 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.367624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.367668 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.367679 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.367700 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.367713 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.470289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.470637 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.470648 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.470666 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.470678 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.573299 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.573351 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.573362 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.573379 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.573391 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.675828 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.675922 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.675935 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.675949 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.676327 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.683694 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" event={"ID":"9ddaa87b-caf7-46de-b693-96c60909d05e","Type":"ContainerStarted","Data":"129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.689055 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.689608 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.689658 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.707110 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.719323 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.720652 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.738267 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.752111 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.764412 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.780290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.780471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.780483 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.780505 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.780517 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.788661 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.802609 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-1
9T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.814669 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.826100 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc 
kubenswrapper[4745]: I0319 00:08:52.837578 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.838280 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.838244948 +0000 UTC m=+105.376440079 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.838732 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.850541 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.866865 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.883195 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.884022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.884212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.884350 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.884484 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.884627 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.896509 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.913702 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.938115 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.938978 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939115 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939214 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.939186563 +0000 UTC m=+105.477381724 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.939219 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.939504 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.939618 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.939713 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939401 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940085 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.940067892 +0000 UTC m=+105.478263023 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939663 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940265 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940345 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940467 4745 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.940455284 +0000 UTC m=+105.478650525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939753 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940638 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.940617809 +0000 UTC m=+105.478812970 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.939926 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940693 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940714 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:52 crc kubenswrapper[4745]: E0319 00:08:52.940766 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:00.940750463 +0000 UTC m=+105.478945624 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.958069 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.973841 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.987812 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.987924 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.987954 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.987989 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.988018 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:52Z","lastTransitionTime":"2026-03-19T00:08:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:52 crc kubenswrapper[4745]: I0319 00:08:52.990085 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:52Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.006517 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.027726 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.042532 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.058826 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.075398 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.090626 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.090923 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.090994 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.091060 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 
00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.091132 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.107400 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.137394 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.137488 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:53 crc kubenswrapper[4745]: E0319 00:08:53.137545 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.137394 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.137414 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:53 crc kubenswrapper[4745]: E0319 00:08:53.137727 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:53 crc kubenswrapper[4745]: E0319 00:08:53.137927 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:53 crc kubenswrapper[4745]: E0319 00:08:53.137971 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.139233 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8
ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314
731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-1
9T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.158787 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.175330 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.190071 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.194717 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.194763 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.194782 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.194807 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.194825 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.204264 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.217660 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.230901 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:53Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:53 crc 
kubenswrapper[4745]: I0319 00:08:53.297963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.298000 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.298012 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.298032 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.298046 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.400801 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.400864 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.400925 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.400946 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.400959 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.504111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.504469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.504616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.504956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.505163 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.608430 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.608743 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.608836 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.608974 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.609071 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.712218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.712264 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.712273 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.712291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.712303 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.815223 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.816055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.816103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.816134 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.816149 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.918567 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.918631 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.918643 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.918665 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:53 crc kubenswrapper[4745]: I0319 00:08:53.918678 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:53Z","lastTransitionTime":"2026-03-19T00:08:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.022555 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.022614 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.022631 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.022653 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.022665 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.125333 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.125390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.125401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.125419 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.125435 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.228098 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.228143 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.228155 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.228174 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.228187 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.330985 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.331043 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.331055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.331076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.331088 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.434768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.434841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.434861 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.434921 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.434942 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.538495 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.538571 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.538589 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.538616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.538636 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.644290 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.644366 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.644386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.644423 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.644466 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.701058 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/0.log"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.705148 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850" exitCode=1
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.705227 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850"}
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.706646 4745 scope.go:117] "RemoveContainer" containerID="61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.725915 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.739664 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.747817 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.747871 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.748031 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.748071 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.748097 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.754957 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc 
kubenswrapper[4745]: I0319 00:08:54.771670 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.784095 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.801171 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.817797 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.836911 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.851734 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.851792 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.851805 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.851826 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.851839 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.855870 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/s
ecrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\
\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4
a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"
kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.871605 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eed
b413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\
\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.906152 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\
\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.924195 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.940370 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.953342 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.956788 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.956841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.956865 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.956942 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.956968 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:54Z","lastTransitionTime":"2026-03-19T00:08:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.963794 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"
image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:54 crc kubenswrapper[4745]: I0319 00:08:54.979067 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:54Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 00:08:54.136181 6563 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 00:08:54.136243 6563 handler.go:190] Sending *v1.Node event handler 
2 for removal\\\\nI0319 00:08:54.136255 6563 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 00:08:54.136284 6563 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 00:08:54.136311 6563 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 00:08:54.136325 6563 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 00:08:54.136353 6563 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 00:08:54.136376 6563 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 00:08:54.136382 6563 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 00:08:54.136401 6563 factory.go:656] Stopping watch factory\\\\nI0319 00:08:54.136420 6563 ovnkube.go:599] Stopped ovnkube\\\\nI0319 00:08:54.136443 6563 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 00:08:54.136453 6563 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 00:08:54.136460 6563 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 00:08:54.136466 6563 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 
00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:54Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.059446 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.059497 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.059508 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.059528 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.059540 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.136859 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.136984 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.137033 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.137118 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.137007 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.137215 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.137290 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.137462 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.162742 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.162798 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.162819 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.162851 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.162872 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.267338 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.267401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.267420 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.267449 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.267470 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.370514 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.370548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.370558 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.370573 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.370583 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.472225 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.472256 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.472263 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.472276 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.472285 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.574740 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.574783 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.574793 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.574810 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.574819 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.677470 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.677906 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.677918 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.677936 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.677954 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.710210 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/0.log" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.712989 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.713374 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.725994 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa388
11c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.736949 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.749186 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc 
kubenswrapper[4745]: I0319 00:08:55.761897 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.780357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.780411 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.780424 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.780445 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.780459 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.795238 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.828352 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99
cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.846052 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.862388 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.868849 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.868900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.868910 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.868926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.868937 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.874705 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z 
is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.884361 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.886771 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.888350 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.888380 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.888389 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.888405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.888415 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.897996 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.900612 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.904786 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.904827 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.904839 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.904857 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.904869 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.917301 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:54Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 00:08:54.136181 6563 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 00:08:54.136243 6563 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 00:08:54.136255 6563 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 
00:08:54.136284 6563 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 00:08:54.136311 6563 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 00:08:54.136325 6563 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 00:08:54.136353 6563 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 00:08:54.136376 6563 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 00:08:54.136382 6563 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 00:08:54.136401 6563 factory.go:656] Stopping watch factory\\\\nI0319 00:08:54.136420 6563 ovnkube.go:599] Stopped ovnkube\\\\nI0319 00:08:54.136443 6563 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 00:08:54.136453 6563 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 00:08:54.136460 6563 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 00:08:54.136466 6563 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 
00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.920555 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.924490 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.924523 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.924532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.924546 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.924557 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.939815 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.940071 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.944667 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.944709 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.944724 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.944750 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.944765 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.952746 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.960855 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: E0319 00:08:55.961074 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.962846 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.962976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.962998 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.963022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.963040 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:55Z","lastTransitionTime":"2026-03-19T00:08:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.972128 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:55 crc kubenswrapper[4745]: I0319 00:08:55.986133 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:55Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.065514 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.065579 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.065592 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.065613 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.065625 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.157320 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf8
6d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.168940 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.168996 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.169016 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.169039 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.169054 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.171158 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.182776 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc 
kubenswrapper[4745]: I0319 00:08:56.197387 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.211283 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.222497 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.235896 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.249870 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.266542 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.271556 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.271586 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.271597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.271619 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.271637 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.281432 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z 
is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.311282 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\
\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\
\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03
-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.326028 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.340934 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.353240 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.365923 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.374929 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.374985 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.375003 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.375026 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.375040 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.390551 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:54Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 00:08:54.136181 6563 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 00:08:54.136243 6563 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 00:08:54.136255 6563 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 
00:08:54.136284 6563 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 00:08:54.136311 6563 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 00:08:54.136325 6563 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 00:08:54.136353 6563 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 00:08:54.136376 6563 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 00:08:54.136382 6563 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 00:08:54.136401 6563 factory.go:656] Stopping watch factory\\\\nI0319 00:08:54.136420 6563 ovnkube.go:599] Stopped ovnkube\\\\nI0319 00:08:54.136443 6563 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 00:08:54.136453 6563 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 00:08:54.136460 6563 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 00:08:54.136466 6563 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 
00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.478634 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.478693 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.478715 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.478761 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.478780 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.582207 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.582258 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.582268 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.582284 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.582295 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.685162 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.685210 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.685222 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.685256 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.685268 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.718587 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/1.log" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.719801 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/0.log" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.723964 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b" exitCode=1 Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.724004 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.724063 4745 scope.go:117] "RemoveContainer" containerID="61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.725367 4745 scope.go:117] "RemoveContainer" containerID="3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b" Mar 19 00:08:56 crc kubenswrapper[4745]: E0319 00:08:56.725714 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.736717 4745 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.748464 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.762308 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.777454 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.790111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.790175 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.790194 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.790222 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.790252 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.795650 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.808257 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located 
when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.824186 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.846470 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ec
e98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.860041 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.879797 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.892780 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.892818 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.892828 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.892842 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.892854 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.898222 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.914020 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\
",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.935706 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ff25b9c0ab638ed563ca0fd71dd3203ded980d479838ade5b85a5514558850\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:54Z\\\",\\\"message\\\":\\\"y (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 00:08:54.136181 6563 
handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0319 00:08:54.136243 6563 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0319 00:08:54.136255 6563 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0319 00:08:54.136284 6563 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0319 00:08:54.136311 6563 handler.go:208] Removed *v1.Node event handler 2\\\\nI0319 00:08:54.136325 6563 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0319 00:08:54.136353 6563 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0319 00:08:54.136376 6563 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0319 00:08:54.136382 6563 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0319 00:08:54.136401 6563 factory.go:656] Stopping watch factory\\\\nI0319 00:08:54.136420 6563 ovnkube.go:599] Stopped ovnkube\\\\nI0319 00:08:54.136443 6563 handler.go:208] Removed *v1.Node event handler 7\\\\nI0319 00:08:54.136453 6563 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0319 00:08:54.136460 6563 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0319 00:08:54.136466 6563 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0319 00\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:51Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} 
{Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io
/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ce
b36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.961125 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.977269 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.995458 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:56 crc 
kubenswrapper[4745]: I0319 00:08:56.996745 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.996832 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.996857 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.996910 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:56 crc kubenswrapper[4745]: I0319 00:08:56.996931 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:56Z","lastTransitionTime":"2026-03-19T00:08:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.100099 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.100152 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.100163 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.100190 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.100421 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.137434 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.137477 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.137546 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.137646 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:08:57 crc kubenswrapper[4745]: E0319 00:08:57.137634 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:08:57 crc kubenswrapper[4745]: E0319 00:08:57.137769 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:08:57 crc kubenswrapper[4745]: E0319 00:08:57.137864 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:08:57 crc kubenswrapper[4745]: E0319 00:08:57.137977 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.203125 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.203176 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.203184 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.203200 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.203211 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.305949 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.306001 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.306012 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.306029 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.306041 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.408903 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.409204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.409305 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.409454 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.409539 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.513564 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.513629 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.513705 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.513732 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.513751 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.617745 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.617813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.617831 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.617859 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.617877 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.686811 4745 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.722047 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.722548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.722766 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.722945 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.723078 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.732153 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/1.log"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.738432 4745 scope.go:117] "RemoveContainer" containerID="3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b"
Mar 19 00:08:57 crc kubenswrapper[4745]: E0319 00:08:57.738720 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.762958 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.787058 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserv
er\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc
276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.806034 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.827011 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.827068 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.827087 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.827114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.827130 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.829231 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.861854 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"da
ta-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.876410 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.890914 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.911315 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.926820 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.932421 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.932476 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.932488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.932507 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.932523 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:57Z","lastTransitionTime":"2026-03-19T00:08:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.961587 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.979483 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:57 crc kubenswrapper[4745]: I0319 00:08:57.996460 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:57Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.009989 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:58 crc 
kubenswrapper[4745]: I0319 00:08:58.025331 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.035312 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.035361 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.035377 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.035400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.035418 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.038341 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.054348 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99
cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:08:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.138581 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.138663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.138691 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc 
kubenswrapper[4745]: I0319 00:08:58.138726 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.138753 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.241174 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.241291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.241314 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.241342 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.241362 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.344108 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.344204 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.344230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.344265 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.344290 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.447843 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.448038 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.448081 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.448121 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.448147 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.551340 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.551389 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.551407 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.551433 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.551451 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.655069 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.655124 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.655141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.655166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.655184 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.758291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.758345 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.758365 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.758390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.758406 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.862451 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.862536 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.862561 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.862593 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.862614 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.965615 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.965690 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.965709 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.965736 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:58 crc kubenswrapper[4745]: I0319 00:08:58.965755 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:58Z","lastTransitionTime":"2026-03-19T00:08:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.069707 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.069773 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.069790 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.069816 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.069834 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.137646 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.137697 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.137807 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.137857 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:08:59 crc kubenswrapper[4745]: E0319 00:08:59.138004 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:08:59 crc kubenswrapper[4745]: E0319 00:08:59.138161 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:08:59 crc kubenswrapper[4745]: E0319 00:08:59.138439 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:08:59 crc kubenswrapper[4745]: E0319 00:08:59.138674 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.174214 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.174271 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.174289 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.174316 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.174334 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.278037 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.278114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.278139 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.278176 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.278199 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.381602 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.381675 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.381696 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.381720 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.381733 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.485658 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.485733 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.485761 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.485796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.485821 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.589248 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.589390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.589403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.589433 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.589445 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.693115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.693193 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.693212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.693237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.693254 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.797302 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.797354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.797367 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.797386 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.797399 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.900812 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.900920 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.900939 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.900966 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:08:59 crc kubenswrapper[4745]: I0319 00:08:59.900984 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:08:59Z","lastTransitionTime":"2026-03-19T00:08:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.004370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.004437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.004454 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.004479 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.004496 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.107380 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.107435 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.107448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.107470 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.107486 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.211469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.211540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.211553 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.211596 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.211614 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.315141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.315520 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.315766 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.315967 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.316113 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.419515 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.420099 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.420111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.420134 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.420150 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.523830 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.523895 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.523908 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.523928 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.523939 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.627775 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.627874 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.627956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.627991 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.628015 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.732400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.732541 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.732561 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.732592 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.732612 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"}
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.835861 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.835978 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.835997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.836024 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.836043 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.933463 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 00:09:00 crc kubenswrapper[4745]: E0319 00:09:00.933874 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:09:16.933822026 +0000 UTC m=+121.472017187 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.940857 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.940918 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.940931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.940949 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:00 crc kubenswrapper[4745]: I0319 00:09:00.940964 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:00Z","lastTransitionTime":"2026-03-19T00:09:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.035161 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.035220 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.035244 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.035263 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.035290 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035385 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035408 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035440 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:17.035426403 +0000 UTC m=+121.573621534 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035442 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035495 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035538 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035561 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035465 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:17.035448564 +0000 UTC m=+121.573643695 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035715 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:17.035624929 +0000 UTC m=+121.573820100 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035718 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035777 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:17.035754094 +0000 UTC m=+121.573949305 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035801 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.035837 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.036049 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:17.036005362 +0000 UTC m=+121.574200523 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.046014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.046070 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.046084 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.046106 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.046123 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.137050 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.137101 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.137217 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.137223 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.137287 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.137343 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.137486 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:01 crc kubenswrapper[4745]: E0319 00:09:01.137642 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.149408 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.149469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.149485 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.149506 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.149525 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.252261 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.252345 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.252368 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.252397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.252417 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.355294 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.355356 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.355373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.355399 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.355418 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.459400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.459483 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.459509 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.459542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.459569 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.562668 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.562759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.562785 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.562817 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.562840 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.666663 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.666755 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.666768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.666790 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.666803 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.769616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.769669 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.769680 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.769697 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.769709 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.873532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.873595 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.873605 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.873626 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.873638 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.977738 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.977796 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.977809 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.977829 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:01 crc kubenswrapper[4745]: I0319 00:09:01.977842 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:01Z","lastTransitionTime":"2026-03-19T00:09:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.079981 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.080029 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.080039 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.080059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.080071 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.151980 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.182960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.183132 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.183163 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.183200 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.183225 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.285626 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.285753 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.285764 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.285779 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.285789 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.388212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.388266 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.388279 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.388301 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.388316 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.497294 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.497357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.497376 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.497403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.497422 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.600354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.600440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.600464 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.600499 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.600522 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.704026 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.704097 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.704114 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.704143 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.704162 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.807281 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.807351 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.807370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.807400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.807420 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.910438 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.910506 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.910519 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.910548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:02 crc kubenswrapper[4745]: I0319 00:09:02.910564 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:02Z","lastTransitionTime":"2026-03-19T00:09:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.013518 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.013583 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.013596 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.013615 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.013627 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.117990 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.118075 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.118102 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.118143 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.118168 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.137540 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.137583 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.137555 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:03 crc kubenswrapper[4745]: E0319 00:09:03.137751 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.137803 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:03 crc kubenswrapper[4745]: E0319 00:09:03.138052 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:03 crc kubenswrapper[4745]: E0319 00:09:03.138175 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:03 crc kubenswrapper[4745]: E0319 00:09:03.138323 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.221678 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.221725 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.221740 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.221759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.221769 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.324564 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.324616 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.324629 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.324653 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.324667 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.428475 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.428554 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.428565 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.428585 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.428627 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.530681 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.530723 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.530735 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.530756 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.530769 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.634019 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.634062 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.634073 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.634089 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.634100 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.736904 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.737004 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.737022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.737045 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.737062 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.840257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.840319 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.840331 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.840347 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.840359 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.943331 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.943375 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.943383 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.943396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:03 crc kubenswrapper[4745]: I0319 00:09:03.943405 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:03Z","lastTransitionTime":"2026-03-19T00:09:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.047087 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.047141 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.047165 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.047189 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.047205 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.149958 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.150034 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.150051 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.150081 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.150103 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.252690 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.252774 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.252819 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.252858 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.252909 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.357554 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.357612 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.357623 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.357643 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.357657 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.461047 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.461101 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.461117 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.461139 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.461153 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.563809 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.563873 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.563905 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.563926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.563941 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.666921 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.666968 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.666997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.667035 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.667053 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.769997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.770045 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.770059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.770079 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.770091 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.874055 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.874103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.874111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.874128 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.874137 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.976727 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.976773 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.976783 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.976799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:04 crc kubenswrapper[4745]: I0319 00:09:04.976810 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:04Z","lastTransitionTime":"2026-03-19T00:09:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.085323 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.085397 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.085412 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.085437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.085459 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.137313 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.137421 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:05 crc kubenswrapper[4745]: E0319 00:09:05.137481 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.137515 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.137334 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:05 crc kubenswrapper[4745]: E0319 00:09:05.137760 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:05 crc kubenswrapper[4745]: E0319 00:09:05.137852 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:05 crc kubenswrapper[4745]: E0319 00:09:05.137956 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.138024 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.188052 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.188088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.188096 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.188115 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.188151 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.291022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.291067 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.291079 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.291097 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.291108 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.394257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.394330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.394354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.394390 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.394414 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.497284 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.497332 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.497343 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.497364 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.497377 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.600200 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.600252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.600266 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.600282 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.600292 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.703839 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.703921 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.703941 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.703965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.703986 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.770667 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.773416 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.773926 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.792105 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ba
a38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.806603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.806642 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.806656 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.806677 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.806690 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.807109 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.821648 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc 
kubenswrapper[4745]: I0319 00:09:05.842814 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.858631 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.880352 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.904781 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.910088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.910150 4745 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.910172 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.910202 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.910222 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:05Z","lastTransitionTime":"2026-03-19T00:09:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.920701 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.943752 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.967029 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:05 crc kubenswrapper[4745]: I0319 00:09:05.993366 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:05Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015155 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015581 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015600 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015624 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.015641 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.034078 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.070050 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with 
unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-nod
e-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] 
Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.119020 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.119075 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.119088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.119113 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.119127 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.141519 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.169011 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.186721 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.199013 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc 
kubenswrapper[4745]: I0319 00:09:06.212611 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.221014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.221049 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.221059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.221080 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.221092 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.225045 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.236032 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.258669 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.270012 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.289480 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.305808 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df
15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.321970 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.324354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.324401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.324416 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.324440 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.324455 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.327611 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.327639 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.327651 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.327665 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.327674 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.340087 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.341738 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.345206 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.345238 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.345250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.345274 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.345286 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.356992 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z 
is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.362649 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.366509 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.366562 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.366592 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.366609 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.366622 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.373732 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.382736 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.387007 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.387051 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.387064 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.387083 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.387095 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.389361 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.402528 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.403042 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.406506 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.406539 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.406550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.406567 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.406579 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.418539 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: E0319 00:09:06.418734 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.423377 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.432706 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.432751 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.432759 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.432776 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.432786 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.452008 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.472391 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:06Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.536348 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 
00:09:06.536403 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.536416 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.536436 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.536449 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.640686 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.640816 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.640841 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.640871 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.640928 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.744623 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.744691 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.744710 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.744738 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.744763 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.855563 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.855645 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.855669 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.855705 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.855728 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.960631 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.960714 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.960732 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.960761 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:06 crc kubenswrapper[4745]: I0319 00:09:06.960780 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:06Z","lastTransitionTime":"2026-03-19T00:09:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.064333 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.064462 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.064488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.064515 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.064534 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.137754 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:07 crc kubenswrapper[4745]: E0319 00:09:07.138007 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.138565 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:07 crc kubenswrapper[4745]: E0319 00:09:07.138633 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.138687 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:07 crc kubenswrapper[4745]: E0319 00:09:07.138742 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.138793 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:07 crc kubenswrapper[4745]: E0319 00:09:07.138854 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.167910 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.167983 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.168008 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.168042 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.168066 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.272367 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.272428 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.272447 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.272475 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.272496 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.376185 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.376251 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.376270 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.376298 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.376314 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.479790 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.479851 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.479863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.479900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.479913 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.583501 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.583550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.583563 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.583586 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.583600 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.687432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.687522 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.687542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.687573 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.687592 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.791076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.791140 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.791158 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.791189 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.791210 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.894504 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.894579 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.894599 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.894630 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.894649 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.997101 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.997154 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.997166 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.997183 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:07 crc kubenswrapper[4745]: I0319 00:09:07.997196 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:07Z","lastTransitionTime":"2026-03-19T00:09:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.099587 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.099646 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.099658 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.099677 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.099693 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.201936 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.201966 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.201974 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.201986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.202002 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.305371 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.305469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.305496 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.305542 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.305570 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.408720 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.408779 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.408793 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.408817 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.408833 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.511469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.511533 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.511550 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.511573 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.511590 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.615360 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.615421 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.615436 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.615459 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.615475 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.718797 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.718866 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.718900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.718931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.718947 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.822714 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.822789 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.822813 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.822846 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.822910 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.926862 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.927332 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.927425 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.927532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:08 crc kubenswrapper[4745]: I0319 00:09:08.927647 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:08Z","lastTransitionTime":"2026-03-19T00:09:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.031865 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.031963 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.031987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.032014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.032035 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.135493 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.135575 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.135601 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.135635 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.135658 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.136761 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:09 crc kubenswrapper[4745]: E0319 00:09:09.137129 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.137170 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:09 crc kubenswrapper[4745]: E0319 00:09:09.137546 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.137255 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.137196 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:09 crc kubenswrapper[4745]: E0319 00:09:09.138080 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:09:09 crc kubenswrapper[4745]: E0319 00:09:09.138091 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.239520 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.239597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.239615 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.239643 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.239660 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.343003 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.343081 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.343108 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.343146 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.343173 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.446729 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.446817 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.446843 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.446878 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.446928 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.550799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.550871 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.550915 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.550937 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.550954 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.654639 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.654701 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.654713 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.654732 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.654744 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.758378 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.758452 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.758471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.758512 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.758535 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.863193 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.863729 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.864076 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.864316 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.864538 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.968040 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.968111 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.968137 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.968169 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:09 crc kubenswrapper[4745]: I0319 00:09:09.968195 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:09Z","lastTransitionTime":"2026-03-19T00:09:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.072512 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.072567 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.072579 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.072600 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.072614 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.176481 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.176548 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.176570 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.176601 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.176620 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.280142 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.280208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.280226 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.280255 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.280274 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.382983 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.383090 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.383132 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.383170 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.383195 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.486754 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.486836 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.486859 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.486932 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.486957 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.590488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.590552 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.590566 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.590591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.590608 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.694447 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.694520 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.694554 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.694588 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.694610 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.797159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.797216 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.797230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.797252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.797265 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.901100 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.901159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.901180 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.901206 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:10 crc kubenswrapper[4745]: I0319 00:09:10.901229 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:10Z","lastTransitionTime":"2026-03-19T00:09:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.004731 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.005127 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.005231 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.005329 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.005443 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.108083 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.108489 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.108593 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.108690 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.108784 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.137422 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.137941 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.138141 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.138251 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:11 crc kubenswrapper[4745]: E0319 00:09:11.138159 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:11 crc kubenswrapper[4745]: E0319 00:09:11.138429 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:11 crc kubenswrapper[4745]: E0319 00:09:11.138530 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:11 crc kubenswrapper[4745]: E0319 00:09:11.138782 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.212291 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.212379 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.212400 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.212431 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.212452 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.315519 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.316230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.316370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.316460 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.316550 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.419646 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.419711 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.419726 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.419748 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.419765 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.523502 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.523587 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.523609 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.523639 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.523665 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.627837 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.627906 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.627917 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.627937 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.627948 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.731058 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.731130 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.731146 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.731173 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.731190 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.834594 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.834655 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.834666 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.834688 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.834700 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.937847 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.937947 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.937960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.937980 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:11 crc kubenswrapper[4745]: I0319 00:09:11.937995 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:11Z","lastTransitionTime":"2026-03-19T00:09:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.040605 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.040658 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.040673 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.040695 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.040709 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.138425 4745 scope.go:117] "RemoveContainer" containerID="3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.143825 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.143923 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.143952 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.143986 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.144007 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.248684 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.249243 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.249259 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.249282 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.249298 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.352601 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.352644 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.352653 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.352670 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.352679 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.456863 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.456931 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.456941 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.456962 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.456974 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.579384 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.579432 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.579442 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.579462 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.579473 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.682706 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.682757 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.682768 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.682789 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.682801 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.785480 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.785521 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.785530 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.785544 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.785554 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.805711 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/1.log" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.808790 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.809367 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.825025 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.838623 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.850225 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.864799 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.879087 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.890197 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.890237 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.890246 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.890264 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.890276 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.895871 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.910902 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df
15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.925948 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.950268 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] 
Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} 
options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\
\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name
\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.970534 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.983732 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.993278 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.993310 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.993319 4745 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.993342 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:12 crc kubenswrapper[4745]: I0319 00:09:12.993353 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:12Z","lastTransitionTime":"2026-03-19T00:09:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.000068 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T00:09:12Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.014515 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recurs
iveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.027960 4745 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.044982 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.059168 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.071432 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc 
kubenswrapper[4745]: I0319 00:09:13.096191 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.096228 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.096241 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.096265 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.096276 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.137090 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.137180 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:13 crc kubenswrapper[4745]: E0319 00:09:13.137230 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.137320 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:13 crc kubenswrapper[4745]: E0319 00:09:13.137339 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.137186 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:13 crc kubenswrapper[4745]: E0319 00:09:13.137592 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:13 crc kubenswrapper[4745]: E0319 00:09:13.137785 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.198707 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.198777 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.198799 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.198833 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.198857 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.302188 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.302246 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.302257 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.302280 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.302293 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.406131 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.406215 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.406240 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.406270 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.406288 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.509108 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.509159 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.509172 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.509190 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.509203 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.611849 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.611925 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.611938 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.611960 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.611972 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.713827 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.713908 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.713926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.713948 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.713961 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.815007 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/2.log" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.816392 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/1.log" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.818075 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.818103 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.818112 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.818126 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.818137 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.827566 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" exitCode=1 Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.827642 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.827698 4745 scope.go:117] "RemoveContainer" containerID="3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.829121 4745 scope.go:117] "RemoveContainer" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" Mar 19 00:09:13 crc kubenswrapper[4745]: E0319 00:09:13.829449 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.849843 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.863562 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.874036 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.891377 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3ae101afa614985ea4f0257ba789d8f15ca7b94d0039a5744af1107126259d7b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:08:55Z\\\",\\\"message\\\":\\\"Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {7e8bb06a-06a5-45bc-a752-26a17d322811}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Port_Group Row:map[] Rows:[] Columns:[] 
Mutations:[{Column:ports Mutator:insert Value:{GoSet:[{GoUUID:61897e97-c771-4738-8709-09636387cb00}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {c02bd945-d57b-49ff-9cd3-202ed3574b26}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.4 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {43933d5e-3c3b-4ff8-8926-04ac25de450e}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:Logical_Router Row:map[] Rows:[] Columns:[] Mutations:[{Column:nat Mutator:insert Value:{GoSet:[{GoUUID:43933d5e-3c3b-4ff8-8926-04ac25de450e}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e3c4661a-36a6-47f0-a6c0-a4ee741f2224}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0319 00:08:55.760702 6707 model_client.go:382] Update operations generated as: [{Op:update Table:Logical_Switch_Port Row:map[addresses:{GoSet:[0a:58:0a:d9:00:3b 10.217.0.59]} options:{GoMap:map[iface-id-ver:9d751cbb\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none 
reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7d
f3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.910480 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.921437 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.921474 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.921483 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.921500 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.921510 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:13Z","lastTransitionTime":"2026-03-19T00:09:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.922324 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.931460 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc 
kubenswrapper[4745]: I0319 00:09:13.943171 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.952936 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.962914 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdea
f3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.973548 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.983006 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:13 crc kubenswrapper[4745]: I0319 00:09:13.994174 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:13Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.004647 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df
15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.015332 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.024603 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.024640 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.024649 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.024665 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.024675 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.029152 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.041126 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1
afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP
\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.127667 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.127724 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.127735 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.127751 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.127779 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.230033 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.230080 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.230088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.230105 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.230115 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.333821 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.333873 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.333900 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.333919 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.333935 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.437049 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.437082 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.437092 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.437105 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.437115 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.539926 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.539956 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.539966 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.539984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.539995 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.642815 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.642845 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.642856 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.642897 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.642913 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.746357 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.746401 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.746412 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.746431 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.746443 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.833173 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/2.log" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.837852 4745 scope.go:117] "RemoveContainer" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" Mar 19 00:09:14 crc kubenswrapper[4745]: E0319 00:09:14.838062 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.850163 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.850212 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.850232 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.850252 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.850264 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.853523 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.871121 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.890228 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.909391 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.927423 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"ho
stIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.951670 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.954469 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.954649 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.954670 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.954703 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.954739 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:14Z","lastTransitionTime":"2026-03-19T00:09:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.970217 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:14 crc kubenswrapper[4745]: I0319 00:09:14.988213 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:14Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.014035 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.050303 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.057965 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.058022 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.058039 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.058059 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.058076 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.072915 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.089688 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.108636 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.118417 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.124197 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.136957 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:15 crc kubenswrapper[4745]: E0319 00:09:15.137157 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.136959 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:15 crc kubenswrapper[4745]: E0319 00:09:15.137300 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.136958 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:15 crc kubenswrapper[4745]: E0319 00:09:15.137406 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.138252 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:15 crc kubenswrapper[4745]: E0319 00:09:15.138417 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.142095 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.159670 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptable
s-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.162023 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.162069 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.162080 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.162100 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.162113 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.184165 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc 
kubenswrapper[4745]: I0319 00:09:15.205189 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.219346 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.235218 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.251134 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"nam
e\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.i
o/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure 
cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.265155 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.265192 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.265202 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.265218 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.265229 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.267684 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c4
4871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.284981 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.303011 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.318093 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.340554 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ec
e98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.354140 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.367826 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 
00:09:15.367953 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.367984 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.368023 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.368044 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.377553 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.390233 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.402279 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.423126 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.438446 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.453301 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.467992 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:15Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:15 crc 
kubenswrapper[4745]: I0319 00:09:15.471014 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.471068 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.471088 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.471113 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.471130 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.574453 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.574500 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.574513 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.574532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.574543 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.677456 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.677523 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.677540 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.677564 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.677577 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.780868 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.780942 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.780954 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.780976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.780991 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.884609 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.884654 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.884665 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.884685 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.884699 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.987343 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.987395 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.987405 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.987425 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:15 crc kubenswrapper[4745]: I0319 00:09:15.987460 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:15Z","lastTransitionTime":"2026-03-19T00:09:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.088540 4745 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.157294 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"
name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c
3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\
\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.173837 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.195966 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" 
limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.211761 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad
6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df
15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod 
\"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.224806 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.249025 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer 
Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.251616 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.274439 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.287539 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.300956 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.311388 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.323060 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.338099 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.351769 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.365801 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc 
kubenswrapper[4745]: I0319 00:09:16.380391 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.393936 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.407441 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.652930 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.653019 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.653042 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.653078 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.653107 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:16Z","lastTransitionTime":"2026-03-19T00:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.669345 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.674439 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.674517 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.674535 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.674560 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.674576 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:16Z","lastTransitionTime":"2026-03-19T00:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.690931 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.696119 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.696205 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.696230 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.696264 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.696290 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:16Z","lastTransitionTime":"2026-03-19T00:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.716950 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.722274 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.722317 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.722330 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.722353 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.722367 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:16Z","lastTransitionTime":"2026-03-19T00:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.737735 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.743254 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.743316 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.743332 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.743354 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.743369 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:16Z","lastTransitionTime":"2026-03-19T00:09:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.761098 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:16Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.761276 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:16 crc kubenswrapper[4745]: I0319 00:09:16.955342 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:09:16 crc kubenswrapper[4745]: E0319 00:09:16.955561 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:09:48.955529704 +0000 UTC m=+153.493724835 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.057291 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.057358 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.057459 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.057488 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.057516 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057553 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057745 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057773 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057792 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057839 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:49.057808393 +0000 UTC m=+153.596003564 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057871 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:49.057857564 +0000 UTC m=+153.596052735 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057916 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058013 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058032 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058044 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058013 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:49.057995499 +0000 UTC m=+153.596190660 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058093 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:49.058079642 +0000 UTC m=+153.596274793 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.057637 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.058139 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:09:49.058128343 +0000 UTC m=+153.596323494 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.137702 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.137745 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.137913 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.137943 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:17 crc kubenswrapper[4745]: I0319 00:09:17.138026 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.138138 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.138245 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:09:17 crc kubenswrapper[4745]: E0319 00:09:17.138345 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:09:19 crc kubenswrapper[4745]: I0319 00:09:19.137089 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:19 crc kubenswrapper[4745]: I0319 00:09:19.137169 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:19 crc kubenswrapper[4745]: I0319 00:09:19.137091 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:19 crc kubenswrapper[4745]: E0319 00:09:19.137328 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:09:19 crc kubenswrapper[4745]: I0319 00:09:19.137112 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:19 crc kubenswrapper[4745]: E0319 00:09:19.137242 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:09:19 crc kubenswrapper[4745]: E0319 00:09:19.137533 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:09:19 crc kubenswrapper[4745]: E0319 00:09:19.137580 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:09:21 crc kubenswrapper[4745]: I0319 00:09:21.137055 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:21 crc kubenswrapper[4745]: I0319 00:09:21.137097 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:21 crc kubenswrapper[4745]: I0319 00:09:21.137102 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:21 crc kubenswrapper[4745]: I0319 00:09:21.137059 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:21 crc kubenswrapper[4745]: E0319 00:09:21.137284 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:09:21 crc kubenswrapper[4745]: E0319 00:09:21.137482 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:09:21 crc kubenswrapper[4745]: E0319 00:09:21.137552 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:09:21 crc kubenswrapper[4745]: E0319 00:09:21.137618 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:09:21 crc kubenswrapper[4745]: E0319 00:09:21.253438 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 00:09:23 crc kubenswrapper[4745]: I0319 00:09:23.137473 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:23 crc kubenswrapper[4745]: I0319 00:09:23.137576 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:23 crc kubenswrapper[4745]: I0319 00:09:23.137589 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:23 crc kubenswrapper[4745]: E0319 00:09:23.137637 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:09:23 crc kubenswrapper[4745]: I0319 00:09:23.137823 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:23 crc kubenswrapper[4745]: E0319 00:09:23.137825 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:09:23 crc kubenswrapper[4745]: E0319 00:09:23.138025 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:09:23 crc kubenswrapper[4745]: E0319 00:09:23.138136 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:09:25 crc kubenswrapper[4745]: I0319 00:09:25.136784 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:25 crc kubenswrapper[4745]: E0319 00:09:25.137695 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:09:25 crc kubenswrapper[4745]: I0319 00:09:25.137057 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:25 crc kubenswrapper[4745]: E0319 00:09:25.138015 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:09:25 crc kubenswrapper[4745]: I0319 00:09:25.137059 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:25 crc kubenswrapper[4745]: E0319 00:09:25.138224 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:09:25 crc kubenswrapper[4745]: I0319 00:09:25.137087 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:25 crc kubenswrapper[4745]: E0319 00:09:25.138446 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.158466 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.177588 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.194758 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.213906 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.227853 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.244078 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:26 crc kubenswrapper[4745]: E0319 00:09:26.254389 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.269818 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",
\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.283204 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.295278 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.310175 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.326231 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.343647 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ec
e98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.391349 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.415440 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.429622 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.445609 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:26 crc kubenswrapper[4745]: I0319 00:09:26.462030 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:26Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.131532 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.131583 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.131597 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.131618 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.131633 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:27Z","lastTransitionTime":"2026-03-19T00:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.137167 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.137215 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.137288 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.137498 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.137606 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.137769 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.137807 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.137917 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.146088 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.150136 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.150175 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.150184 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.150200 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.150210 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:27Z","lastTransitionTime":"2026-03-19T00:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.161955 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.165373 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.165417 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.165426 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.165446 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.165458 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:27Z","lastTransitionTime":"2026-03-19T00:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.176263 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.179167 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.179199 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.179208 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.179223 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.179233 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:27Z","lastTransitionTime":"2026-03-19T00:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.191198 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.195370 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.195413 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.195425 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.195444 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:27 crc kubenswrapper[4745]: I0319 00:09:27.195460 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:27Z","lastTransitionTime":"2026-03-19T00:09:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.206593 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:27Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:27 crc kubenswrapper[4745]: E0319 00:09:27.206819 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:29 crc kubenswrapper[4745]: I0319 00:09:29.137531 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:29 crc kubenswrapper[4745]: I0319 00:09:29.137588 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:29 crc kubenswrapper[4745]: I0319 00:09:29.137626 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:29 crc kubenswrapper[4745]: E0319 00:09:29.137750 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:29 crc kubenswrapper[4745]: I0319 00:09:29.137861 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:29 crc kubenswrapper[4745]: E0319 00:09:29.138122 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:29 crc kubenswrapper[4745]: E0319 00:09:29.138164 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:29 crc kubenswrapper[4745]: E0319 00:09:29.138352 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:30 crc kubenswrapper[4745]: I0319 00:09:30.137752 4745 scope.go:117] "RemoveContainer" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" Mar 19 00:09:30 crc kubenswrapper[4745]: E0319 00:09:30.138512 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:09:31 crc kubenswrapper[4745]: I0319 00:09:31.137424 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:31 crc kubenswrapper[4745]: I0319 00:09:31.137530 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:31 crc kubenswrapper[4745]: I0319 00:09:31.138182 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:31 crc kubenswrapper[4745]: I0319 00:09:31.138411 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:31 crc kubenswrapper[4745]: E0319 00:09:31.138365 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:31 crc kubenswrapper[4745]: E0319 00:09:31.138567 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:31 crc kubenswrapper[4745]: E0319 00:09:31.138694 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:31 crc kubenswrapper[4745]: E0319 00:09:31.139272 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:31 crc kubenswrapper[4745]: I0319 00:09:31.152060 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 00:09:31 crc kubenswrapper[4745]: E0319 00:09:31.256740 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.905108 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/0.log" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.905165 4745 generic.go:334] "Generic (PLEG): container finished" podID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" containerID="7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2" exitCode=1 Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.905196 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerDied","Data":"7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2"} Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.905724 4745 scope.go:117] "RemoveContainer" containerID="7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.921513 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.936625 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.947184 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:32 crc 
kubenswrapper[4745]: I0319 00:09:32.970099 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.981340 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:32 crc kubenswrapper[4745]: I0319 00:09:32.992559 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:32Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.006609 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.018355 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":t
rue,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\"
:\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.036376 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.048914 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.061925 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.076739 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.094094 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ec
e98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.104763 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.116667 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.126931 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.136846 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.136913 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.136913 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:33 crc kubenswrapper[4745]: E0319 00:09:33.137066 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.136935 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:33 crc kubenswrapper[4745]: E0319 00:09:33.137144 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.136939 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:33 crc kubenswrapper[4745]: E0319 00:09:33.137208 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:33 crc kubenswrapper[4745]: E0319 00:09:33.137283 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.160681 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.913210 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/0.log" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.913309 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerStarted","Data":"486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915"} Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.935591 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.949995 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.964056 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.978389 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:33 crc kubenswrapper[4745]: I0319 00:09:33.992864 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:33Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.004259 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.016651 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.028644 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.041293 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.064320 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.084177 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.095611 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.105481 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc 
kubenswrapper[4745]: I0319 00:09:34.119018 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.131294 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.143118 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdea
f3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.156432 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:34 crc kubenswrapper[4745]: I0319 00:09:34.168976 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:34Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:35 crc kubenswrapper[4745]: I0319 00:09:35.137127 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:35 crc kubenswrapper[4745]: I0319 00:09:35.137127 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:35 crc kubenswrapper[4745]: E0319 00:09:35.137302 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:35 crc kubenswrapper[4745]: I0319 00:09:35.137147 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:35 crc kubenswrapper[4745]: E0319 00:09:35.137365 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:35 crc kubenswrapper[4745]: I0319 00:09:35.137144 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:35 crc kubenswrapper[4745]: E0319 00:09:35.137450 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:35 crc kubenswrapper[4745]: E0319 00:09:35.137557 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.153063 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.171224 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.190672 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.206035 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.223791 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.238112 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: E0319 00:09:36.257137 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.264979 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.281025 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.293820 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.307413 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.319222 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.343959 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.358186 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.373054 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.386734 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc 
kubenswrapper[4745]: I0319 00:09:36.399896 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.412432 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:36 crc kubenswrapper[4745]: I0319 00:09:36.424599 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:36Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.137055 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.137087 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.137071 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.137071 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.137202 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.137304 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.137353 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.137651 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.534915 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.534952 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.534961 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.534976 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.534988 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:37Z","lastTransitionTime":"2026-03-19T00:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.555794 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.561514 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.561590 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.561608 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.561633 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.561652 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:37Z","lastTransitionTime":"2026-03-19T00:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.584782 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.590407 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.590471 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.590529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.590549 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.590563 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:37Z","lastTransitionTime":"2026-03-19T00:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.614048 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.619251 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.619304 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.619318 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.619341 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.619355 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:37Z","lastTransitionTime":"2026-03-19T00:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.641413 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:37Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.646780 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.646870 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.646944 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.646987 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:37 crc kubenswrapper[4745]: I0319 00:09:37.647010 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:37Z","lastTransitionTime":"2026-03-19T00:09:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.665029 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:37Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:37 crc kubenswrapper[4745]: E0319 00:09:37.665166 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 19 00:09:39 crc kubenswrapper[4745]: I0319 00:09:39.137159 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:39 crc kubenswrapper[4745]: I0319 00:09:39.137292 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:39 crc kubenswrapper[4745]: E0319 00:09:39.137386 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:09:39 crc kubenswrapper[4745]: I0319 00:09:39.137187 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:39 crc kubenswrapper[4745]: I0319 00:09:39.137187 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:39 crc kubenswrapper[4745]: E0319 00:09:39.137448 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:09:39 crc kubenswrapper[4745]: E0319 00:09:39.137595 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:09:39 crc kubenswrapper[4745]: E0319 00:09:39.137914 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:09:41 crc kubenswrapper[4745]: I0319 00:09:41.137296 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:41 crc kubenswrapper[4745]: I0319 00:09:41.137369 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:41 crc kubenswrapper[4745]: I0319 00:09:41.137414 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:41 crc kubenswrapper[4745]: I0319 00:09:41.137601 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:41 crc kubenswrapper[4745]: E0319 00:09:41.137701 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:09:41 crc kubenswrapper[4745]: E0319 00:09:41.137841 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:09:41 crc kubenswrapper[4745]: E0319 00:09:41.137964 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:09:41 crc kubenswrapper[4745]: E0319 00:09:41.138154 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:09:41 crc kubenswrapper[4745]: E0319 00:09:41.259024 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.139368 4745 scope.go:117] "RemoveContainer" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729"
Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.948063 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/2.log"
Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.952242 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"}
Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.952693 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.969461 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.983029 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:42 crc kubenswrapper[4745]: I0319 00:09:42.994927 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.008259 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.020863 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.032680 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.050100 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.069822 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f1
3fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.084931 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582
5771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"r
esource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.100147 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.118343 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.132034 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.136728 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.136835 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.136748 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:43 crc kubenswrapper[4745]: E0319 00:09:43.136965 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:43 crc kubenswrapper[4745]: E0319 00:09:43.137064 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.137119 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:43 crc kubenswrapper[4745]: E0319 00:09:43.137222 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:43 crc kubenswrapper[4745]: E0319 00:09:43.137362 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.151918 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"nam
e\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"rea
dy\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.183900 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.197583 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.213727 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.232909 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.247026 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.957101 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/3.log" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.958138 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/2.log" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.961353 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" exitCode=1 Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.961388 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.961424 4745 scope.go:117] "RemoveContainer" containerID="70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.962539 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:09:43 crc kubenswrapper[4745]: E0319 00:09:43.962806 
4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.978393 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready
\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:43 crc kubenswrapper[4745]: I0319 00:09:43.995757 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99
cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:43Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.013667 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.033072 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.051159 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.066690 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.088615 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.106096 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.123287 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.135257 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.145834 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.155254 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.163938 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.180681 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://70232b41968e1b7b7f09ed6549e1ae032019525b78ea97f52dc5495e1886a729\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:13Z\\\",\\\"message\\\":\\\" transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_router_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} 
protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:169.254.0.2:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {4de02fb8-85f8-4208-9384-785ba5457d16}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:default/kubernetes]} name:Service_default/kubernetes_TCP_node_switch_crc options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.1:443:192.168.126.11:6443]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {b21188fe-5483-4717-afe6-20a41a40b91a}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0319 00:09:12.991936 6932 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:42Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node 
crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z]\\\\nI0319 00:09:42.968627 7272 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, 
Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e
926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.202861 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.218133 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.227983 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.241796 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.967149 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/3.log" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.971017 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:09:44 crc kubenswrapper[4745]: E0319 00:09:44.971215 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:09:44 crc kubenswrapper[4745]: I0319 00:09:44.987675 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.000338 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:44Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.012111 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc 
kubenswrapper[4745]: I0319 00:09:45.024702 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.035383 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.045738 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.060431 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.072363 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.091379 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.106977 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.122165 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.136312 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.137346 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.137403 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.137357 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:45 crc kubenswrapper[4745]: E0319 00:09:45.137523 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.137348 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:45 crc kubenswrapper[4745]: E0319 00:09:45.137821 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:45 crc kubenswrapper[4745]: E0319 00:09:45.138054 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:45 crc kubenswrapper[4745]: E0319 00:09:45.138128 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.176315 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.193298 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.220098 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.230698 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.240450 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:45 crc kubenswrapper[4745]: I0319 00:09:45.258004 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:42Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z]\\\\nI0319 00:09:42.968627 7272 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:45Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.161098 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.183326 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.200462 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.220076 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; 
done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.234965 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: E0319 00:09:46.259665 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.261076 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac
296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.284143 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f
0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\
\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.302084 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.323608 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.339174 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.350446 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.371214 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:42Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z]\\\\nI0319 00:09:42.968627 7272 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.394092 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.407375 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.419068 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.432742 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.445844 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:46 crc kubenswrapper[4745]: I0319 00:09:46.458015 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:46Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:47 crc 
kubenswrapper[4745]: I0319 00:09:47.137529 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.137582 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:47 crc kubenswrapper[4745]: E0319 00:09:47.137706 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.137713 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.137781 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:47 crc kubenswrapper[4745]: E0319 00:09:47.137905 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:47 crc kubenswrapper[4745]: E0319 00:09:47.138066 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:47 crc kubenswrapper[4745]: E0319 00:09:47.138155 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.969916 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.969953 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.969962 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.969997 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.970008 4745 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:47Z","lastTransitionTime":"2026-03-19T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:47 crc kubenswrapper[4745]: E0319 00:09:47.985782 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:47Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.990578 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.990649 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.990660 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.990679 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:47 crc kubenswrapper[4745]: I0319 00:09:47.990690 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:47Z","lastTransitionTime":"2026-03-19T00:09:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:48 crc kubenswrapper[4745]: E0319 00:09:48.014537 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:47Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.020423 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.020488 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.020501 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.020524 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.020541 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:48Z","lastTransitionTime":"2026-03-19T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:48 crc kubenswrapper[4745]: E0319 00:09:48.041714 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.046758 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.046808 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.046822 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.046840 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.046854 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:48Z","lastTransitionTime":"2026-03-19T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:48 crc kubenswrapper[4745]: E0319 00:09:48.088180 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.093452 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.093503 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.093529 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.093553 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:48 crc kubenswrapper[4745]: I0319 00:09:48.093573 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:48Z","lastTransitionTime":"2026-03-19T00:09:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:48 crc kubenswrapper[4745]: E0319 00:09:48.109580 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:48Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:48 crc kubenswrapper[4745]: E0319 00:09:48.110100 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.023009 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.023293 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.023253587 +0000 UTC m=+217.561448718 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.124272 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.125973 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.124531 4745 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.126109 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:49 crc 
kubenswrapper[4745]: E0319 00:09:49.126086 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126309 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.126253185 +0000 UTC m=+217.664448446 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126365 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.126376 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126402 4745 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126514 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.126487463 +0000 UTC m=+217.664682754 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.126598 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126677 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126701 4745 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126717 4745 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: 
[object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126771 4745 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126783 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.126765362 +0000 UTC m=+217.664960653 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.126857 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.126835344 +0000 UTC m=+217.665030495 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.127135 4745 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.127433 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs podName:5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7 nodeName:}" failed. No retries permitted until 2026-03-19 00:10:53.127276889 +0000 UTC m=+217.665472150 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs") pod "network-metrics-daemon-4r5k5" (UID: "5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.136808 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.136952 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.136924 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:49 crc kubenswrapper[4745]: I0319 00:09:49.136852 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.137284 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.137404 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.137570 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:49 crc kubenswrapper[4745]: E0319 00:09:49.137775 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:51 crc kubenswrapper[4745]: I0319 00:09:51.137337 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:51 crc kubenswrapper[4745]: I0319 00:09:51.137443 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:51 crc kubenswrapper[4745]: E0319 00:09:51.137523 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:51 crc kubenswrapper[4745]: I0319 00:09:51.137348 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:51 crc kubenswrapper[4745]: E0319 00:09:51.137630 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:51 crc kubenswrapper[4745]: E0319 00:09:51.137788 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:51 crc kubenswrapper[4745]: I0319 00:09:51.137958 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:51 crc kubenswrapper[4745]: E0319 00:09:51.138053 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:51 crc kubenswrapper[4745]: E0319 00:09:51.260985 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:09:53 crc kubenswrapper[4745]: I0319 00:09:53.137100 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:53 crc kubenswrapper[4745]: I0319 00:09:53.137157 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:53 crc kubenswrapper[4745]: E0319 00:09:53.137234 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:53 crc kubenswrapper[4745]: I0319 00:09:53.137245 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:53 crc kubenswrapper[4745]: I0319 00:09:53.137331 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:53 crc kubenswrapper[4745]: E0319 00:09:53.137381 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:53 crc kubenswrapper[4745]: E0319 00:09:53.137461 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:53 crc kubenswrapper[4745]: E0319 00:09:53.137553 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:55 crc kubenswrapper[4745]: I0319 00:09:55.137896 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:55 crc kubenswrapper[4745]: I0319 00:09:55.137948 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:55 crc kubenswrapper[4745]: I0319 00:09:55.137977 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:55 crc kubenswrapper[4745]: I0319 00:09:55.137875 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:55 crc kubenswrapper[4745]: E0319 00:09:55.138067 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:55 crc kubenswrapper[4745]: E0319 00:09:55.138174 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:55 crc kubenswrapper[4745]: E0319 00:09:55.138306 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:55 crc kubenswrapper[4745]: E0319 00:09:55.138462 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.157928 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6fe483ef-54c8-41a8-91ae-0fda4a47fed7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f93107aa3be86576b37990c9874e654b43f2e303394862580702ce3ffa6275a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2d7fddf0917743448eb834b72e27965a5af6f7d70e135760d9ffcc014c93b6b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f4fc9e2672e20b81a5551b0efe9a86c597a606edbb978d3ba56075799cc71f11\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://df56466af3bf5c4ece98c8eaa6b0512236aef953001c2c513d8423573379b59a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f4e54c68d78b7966707dafa49f8bc2a62d46bf727f46a68c1ce46ae0b20ffba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f12835e6286757546cb059ac2b5678648baf37616a16d1506c97a90f4feedc7d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e48969077d0282bee23091bb9fc74d6cf8443e74751eda75e90a4f67758945c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2dc9c08777824f4e910a30b5a18a884798f47086da4117a53ddc3ee437214102\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.172215 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.184137 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://68bd8ff51ebdc9e22a57c8e948f52bdcc547cf689e0f00baf16b3a8ee774a58e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c18f8af435dc93dcf8f70ceed239fa7d3e16f2840cfc8af92519855bc704229e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.193684 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c4250eab-9d3c-457f-9a78-50400c5f65f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bc3206e6ee917d444d56450d487bd4b4b524d261ad03c73de2804563c5f9af31\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbfedf6e33005c947f4baf5639dd319f0affd
39efbe44694fd57edb5799781a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-95ftv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-qvl5j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.203924 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-xjkg8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"09cb2800-ce49-44cf-89b5-d1e5459299c5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a2f7d4a57a8d73c3f237207c3205e8ddcbc93213ffb07a1668d7b306e6b368d0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gbtfl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-xjkg8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.221424 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"21835778-c889-4031-b630-586c00f200f9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:42Z\\\",\\\"message\\\":\\\"twork controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network 
controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:42Z is after 2025-08-24T17:21:41Z]\\\\nI0319 00:09:42.968627 7272 services_controller.go:473] Services do not match for network=default, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-apiserver/apiserver_TCP_cluster\\\\\\\", UUID:\\\\\\\"d71b38eb-32af-4c0f-9490-7c317c111e3a\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-apiserver/apiserver\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]st\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:09:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://33b407061668c052e1
1314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-kwglz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-w2988\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.236729 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://75200739b9b34f4c8010e1c95750c68a0926505180dc54c5404b15f615bde748\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.248234 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:49Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c03c0829eeaf85ce7515f85dee8a904800881acc1970b40adf250ca20e1432ed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.260433 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j7kx7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4r5k5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc 
kubenswrapper[4745]: E0319 00:09:56.262155 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.273515 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.286984 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-5xqfc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"33ce3f8d-5035-4139-b206-f3c36e53618c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11c1e5c1426de95c46d5a6b29b4ef42adc59ef83ea6d6ac72422e62775948482\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48rc4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-5xqfc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.299132 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"400972f4-050f-4f26-b982-ced6f2590c8b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4afdeaf96e36a90b2054368f2070d3323ed11fe74c6b31b1a297ba745978295\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jxx4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qt5t5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.310298 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mlwp7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T00:09:32Z\\\",\\\"message\\\":\\\"2026-03-19T00:08:47+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb\\\\n2026-03-19T00:08:47+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_ae9e106b-874c-42b7-9397-978deb0852eb to /host/opt/cni/bin/\\\\n2026-03-19T00:08:47Z [verbose] multus-daemon started\\\\n2026-03-19T00:08:47Z [verbose] Readiness Indicator file check\\\\n2026-03-19T00:09:32Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-p582k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mlwp7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.320304 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4ff57408-3991-436d-a939-3948f686285a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b1e8257c05a50ecf277d4c2c8f7a4d5a5607ab9d22034aa9eca904fd8fad008\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://769ed961b9d7f88a7b4417bc5a6c3f979cb0025b8b9fde3490b3af2170038f0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7d5ed9af1438fc22d155095c9b2e08c97ea91980ed0beeaf490b6112ee5196a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://a8615e87d67723cff0af84927a6955c4d2cfad25ca671e2bbdcd4b0791629547\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.334372 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bd7a8e96-6428-45d1-90bf-0e26563710a7\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:08:23Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 00:08:22.630486 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 00:08:22.630673 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 00:08:22.632316 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3168303944/tls.crt::/tmp/serving-cert-3168303944/tls.key\\\\\\\"\\\\nI0319 00:08:23.074716 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 00:08:23.076355 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 00:08:23.076372 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 00:08:23.076395 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 00:08:23.076399 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 00:08:23.080007 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 00:08:23.080030 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 00:08:23.080036 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080063 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 00:08:23.080067 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 00:08:23.080071 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 00:08:23.080075 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 00:08:23.080077 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 00:08:23.081566 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:22Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:09:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:19Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a398e5c5a761a9fa65c88f0395e0209e532
fa56f1f75a4f94e877df205da09a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:07:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.346906 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d375edce-9ba9-4a9b-ae08-3cf6e0860be2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:07:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ba21927edef5767b8ace5392b94b979c8d6c8688e35c5d0e44857ee17d57364c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://92cfb9e52089729c491ba0546d83c8d1ed4c84725afa73bd3c44871189e4f8d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T00:07:48Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 00:07:18.041090 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 00:07:18.043457 1 observer_polling.go:159] Starting file observer\\\\nI0319 00:07:18.075114 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 00:07:18.078900 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nF0319 00:07:48.444972 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5df15b3e3a9293104462b1e8a8afedd1fcd0f4629ac3b10ed629d63da0ef0686\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a6898776f4346cea12bd73ee4fd2ba5b5cd9f77006883dfc8b3357b897c07903\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:07:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"1
92.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:07:16Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.358613 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:45Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:56 crc kubenswrapper[4745]: I0319 00:09:56.373384 4745 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9ddaa87b-caf7-46de-b693-96c60909d05e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T00:08:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://129058eb4dacca8bc821d43660818fd7d1de0cb24d6d73aa5b71099e1ada536f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T00:08:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f5ee497b4f5d0cc7ffd1196b203ab8db53a1c5423525f4e343097bcd64907771\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9c18715ac296780a7bb2711225cd96c73e22c390f377d8f711fe8ec0eebad3c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea4fe15473733d94685488c0b7326a07e69507f6a78a03b68da7cb28b3ebabc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://61ba9
7d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://61ba97d780d2fcdf7bf82c577f91763c938e2eae3cb9cd5d4351d0b5c3b3df5d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f608bba76481ed6d2ed8fc723a6cbb3ad95476ccb6dac729c4de2d39cab1cc9a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e7895ebfe37cd86b2cfd1817f4ba5f5291b940b0acae871ba44570bc8bd1be41\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T00:08:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T00:08:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qwl4z\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T00:08:45Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-n8tr6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:56Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:57 crc kubenswrapper[4745]: I0319 00:09:57.137606 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:09:57 crc kubenswrapper[4745]: I0319 00:09:57.137707 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:09:57 crc kubenswrapper[4745]: E0319 00:09:57.137768 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:09:57 crc kubenswrapper[4745]: I0319 00:09:57.137629 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:09:57 crc kubenswrapper[4745]: I0319 00:09:57.137629 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:09:57 crc kubenswrapper[4745]: E0319 00:09:57.137864 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:09:57 crc kubenswrapper[4745]: E0319 00:09:57.137954 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:09:57 crc kubenswrapper[4745]: E0319 00:09:57.138012 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.154423 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.167250 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.167319 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.167338 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.167364 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.167384
4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:58Z","lastTransitionTime":"2026-03-19T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.184098 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:58Z is after 2025-08-24T17:21:41Z"
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.188359 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.188415 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.188427 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.188448 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.188460 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:58Z","lastTransitionTime":"2026-03-19T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.202219 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.206338 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.206402 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.206417 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.206442 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.206456 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:58Z","lastTransitionTime":"2026-03-19T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.223938 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.228638 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.228697 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.228708 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.228728 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.228740 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:58Z","lastTransitionTime":"2026-03-19T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.244318 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.248524 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.248591 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.248614 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.248642 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 00:09:58 crc kubenswrapper[4745]: I0319 00:09:58.248665 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:09:58Z","lastTransitionTime":"2026-03-19T00:09:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.265705 4745 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T00:09:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f552003b-21de-4401-adf9-0568d3518be8\\\",\\\"systemUUID\\\":\\\"a46e9744-0e29-4d5f-ba27-ee05bebca43c\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T00:09:58Z is after 2025-08-24T17:21:41Z" Mar 19 00:09:58 crc kubenswrapper[4745]: E0319 00:09:58.265913 4745 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 00:09:59 crc kubenswrapper[4745]: I0319 00:09:59.137685 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:09:59 crc kubenswrapper[4745]: I0319 00:09:59.137806 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:09:59 crc kubenswrapper[4745]: E0319 00:09:59.137833 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:09:59 crc kubenswrapper[4745]: E0319 00:09:59.138010 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:09:59 crc kubenswrapper[4745]: I0319 00:09:59.138075 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:09:59 crc kubenswrapper[4745]: I0319 00:09:59.138197 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:09:59 crc kubenswrapper[4745]: E0319 00:09:59.138229 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:09:59 crc kubenswrapper[4745]: E0319 00:09:59.138373 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:09:59 crc kubenswrapper[4745]: I0319 00:09:59.139283 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:09:59 crc kubenswrapper[4745]: E0319 00:09:59.139483 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" Mar 19 00:10:01 crc kubenswrapper[4745]: I0319 00:10:01.137309 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:01 crc kubenswrapper[4745]: E0319 00:10:01.137835 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:01 crc kubenswrapper[4745]: I0319 00:10:01.137404 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:01 crc kubenswrapper[4745]: E0319 00:10:01.137939 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:01 crc kubenswrapper[4745]: I0319 00:10:01.137365 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:01 crc kubenswrapper[4745]: I0319 00:10:01.137962 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:01 crc kubenswrapper[4745]: E0319 00:10:01.138010 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:01 crc kubenswrapper[4745]: E0319 00:10:01.138165 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:01 crc kubenswrapper[4745]: E0319 00:10:01.263595 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:03 crc kubenswrapper[4745]: I0319 00:10:03.137382 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:03 crc kubenswrapper[4745]: I0319 00:10:03.137533 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:03 crc kubenswrapper[4745]: I0319 00:10:03.137625 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:03 crc kubenswrapper[4745]: E0319 00:10:03.137556 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:03 crc kubenswrapper[4745]: E0319 00:10:03.137982 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:03 crc kubenswrapper[4745]: I0319 00:10:03.138147 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:03 crc kubenswrapper[4745]: E0319 00:10:03.138343 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:03 crc kubenswrapper[4745]: E0319 00:10:03.138462 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:05 crc kubenswrapper[4745]: I0319 00:10:05.137667 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:05 crc kubenswrapper[4745]: I0319 00:10:05.137717 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:05 crc kubenswrapper[4745]: I0319 00:10:05.138355 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:05 crc kubenswrapper[4745]: I0319 00:10:05.138819 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:05 crc kubenswrapper[4745]: E0319 00:10:05.143573 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:05 crc kubenswrapper[4745]: E0319 00:10:05.143810 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:05 crc kubenswrapper[4745]: E0319 00:10:05.144282 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:05 crc kubenswrapper[4745]: E0319 00:10:05.145099 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.226691 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-5xqfc" podStartSLOduration=132.226669753 podStartE2EDuration="2m12.226669753s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.226138745 +0000 UTC m=+170.764333886" watchObservedRunningTime="2026-03-19 00:10:06.226669753 +0000 UTC m=+170.764864884" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.239438 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podStartSLOduration=132.239417544 podStartE2EDuration="2m12.239417544s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.2393021 +0000 UTC m=+170.777497261" watchObservedRunningTime="2026-03-19 00:10:06.239417544 +0000 UTC m=+170.777612675" Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.257497 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-n8tr6" 
podStartSLOduration=132.257472192 podStartE2EDuration="2m12.257472192s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.256476368 +0000 UTC m=+170.794671499" watchObservedRunningTime="2026-03-19 00:10:06.257472192 +0000 UTC m=+170.795667313"
Mar 19 00:10:06 crc kubenswrapper[4745]: E0319 00:10:06.264114 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.277721 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mlwp7" podStartSLOduration=132.277696341 podStartE2EDuration="2m12.277696341s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.275168747 +0000 UTC m=+170.813363888" watchObservedRunningTime="2026-03-19 00:10:06.277696341 +0000 UTC m=+170.815891472"
Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.296020 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=35.295991666 podStartE2EDuration="35.295991666s" podCreationTimestamp="2026-03-19 00:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.295295173 +0000 UTC m=+170.833490304" watchObservedRunningTime="2026-03-19 00:10:06.295991666 +0000 UTC m=+170.834186797"
Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.305102 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.305086267 podStartE2EDuration="8.305086267s" podCreationTimestamp="2026-03-19 00:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.304945152 +0000 UTC m=+170.843140313" watchObservedRunningTime="2026-03-19 00:10:06.305086267 +0000 UTC m=+170.843281398"
Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.324395 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=76.324374295 podStartE2EDuration="1m16.324374295s" podCreationTimestamp="2026-03-19 00:08:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.322376959 +0000 UTC m=+170.860572090" watchObservedRunningTime="2026-03-19 00:10:06.324374295 +0000 UTC m=+170.862569426"
Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.338659 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=64.338644217 podStartE2EDuration="1m4.338644217s" podCreationTimestamp="2026-03-19 00:09:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.337330633 +0000 UTC m=+170.875525764" watchObservedRunningTime="2026-03-19 00:10:06.338644217 +0000 UTC m=+170.876839348"
Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.402552 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=74.402524621 podStartE2EDuration="1m14.402524621s" podCreationTimestamp="2026-03-19 00:08:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.400820025 +0000 UTC m=+170.939015186" watchObservedRunningTime="2026-03-19 00:10:06.402524621 +0000 UTC m=+170.940719762"
Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.451006 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-qvl5j" podStartSLOduration=132.450981684 podStartE2EDuration="2m12.450981684s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.450652663 +0000 UTC m=+170.988847804" watchObservedRunningTime="2026-03-19 00:10:06.450981684 +0000 UTC m=+170.989176815"
Mar 19 00:10:06 crc kubenswrapper[4745]: I0319 00:10:06.464972 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xjkg8" podStartSLOduration=132.464948487 podStartE2EDuration="2m12.464948487s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:06.462848486 +0000 UTC m=+171.001043647" watchObservedRunningTime="2026-03-19 00:10:06.464948487 +0000 UTC m=+171.003143628"
Mar 19 00:10:07 crc kubenswrapper[4745]: I0319 00:10:07.136849 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:07 crc kubenswrapper[4745]: I0319 00:10:07.136936 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:07 crc kubenswrapper[4745]: I0319 00:10:07.136991 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:07 crc kubenswrapper[4745]: I0319 00:10:07.136855 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:07 crc kubenswrapper[4745]: E0319 00:10:07.137075 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:10:07 crc kubenswrapper[4745]: E0319 00:10:07.137267 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:10:07 crc kubenswrapper[4745]: E0319 00:10:07.137377 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:10:07 crc kubenswrapper[4745]: E0319 00:10:07.137455 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.435259 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.435348 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.435367 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.435396 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.435418 4745 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T00:10:08Z","lastTransitionTime":"2026-03-19T00:10:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.508819 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"]
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.509598 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.513217 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.513457 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.513916 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.515549 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.656565 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd12a08-b896-4cbc-9688-981ca0494b82-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.656614 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd12a08-b896-4cbc-9688-981ca0494b82-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.656665 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.656748 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.656798 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd12a08-b896-4cbc-9688-981ca0494b82-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.757752 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.758283 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.758405 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd12a08-b896-4cbc-9688-981ca0494b82-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.757991 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.758358 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ecd12a08-b896-4cbc-9688-981ca0494b82-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.758483 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd12a08-b896-4cbc-9688-981ca0494b82-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.758679 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd12a08-b896-4cbc-9688-981ca0494b82-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.761793 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ecd12a08-b896-4cbc-9688-981ca0494b82-service-ca\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.766344 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecd12a08-b896-4cbc-9688-981ca0494b82-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.783822 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ecd12a08-b896-4cbc-9688-981ca0494b82-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-b2hh4\" (UID: \"ecd12a08-b896-4cbc-9688-981ca0494b82\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:08 crc kubenswrapper[4745]: I0319 00:10:08.828506 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4"
Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.057844 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" event={"ID":"ecd12a08-b896-4cbc-9688-981ca0494b82","Type":"ContainerStarted","Data":"c54bb9f661d4b5732582033f80f563740b19734b6d3180d560f25bf5cdd8e4f1"}
Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.058365 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" event={"ID":"ecd12a08-b896-4cbc-9688-981ca0494b82","Type":"ContainerStarted","Data":"1a18f51e597541ae70ada00e939b9962390a950a7cf2fd8cee1de8bf8f86d7af"}
Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.076435 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-b2hh4" podStartSLOduration=135.07637774 podStartE2EDuration="2m15.07637774s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:09.074163087 +0000 UTC m=+173.612358218" watchObservedRunningTime="2026-03-19 00:10:09.07637774 +0000 UTC m=+173.614572901"
Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.137676 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.137753 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.137694 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:09 crc kubenswrapper[4745]: E0319 00:10:09.137870 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.137689 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:09 crc kubenswrapper[4745]: E0319 00:10:09.137993 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:10:09 crc kubenswrapper[4745]: E0319 00:10:09.138042 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:10:09 crc kubenswrapper[4745]: E0319 00:10:09.138090 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.191022 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 19 00:10:09 crc kubenswrapper[4745]: I0319 00:10:09.204744 4745 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 19 00:10:11 crc kubenswrapper[4745]: I0319 00:10:11.137193 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:11 crc kubenswrapper[4745]: I0319 00:10:11.137682 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:11 crc kubenswrapper[4745]: I0319 00:10:11.137718 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:11 crc kubenswrapper[4745]: I0319 00:10:11.137742 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:11 crc kubenswrapper[4745]: E0319 00:10:11.138136 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:10:11 crc kubenswrapper[4745]: E0319 00:10:11.138122 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:10:11 crc kubenswrapper[4745]: E0319 00:10:11.138204 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:10:11 crc kubenswrapper[4745]: E0319 00:10:11.138289 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:10:11 crc kubenswrapper[4745]: E0319 00:10:11.266106 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 00:10:12 crc kubenswrapper[4745]: I0319 00:10:12.138056 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"
Mar 19 00:10:12 crc kubenswrapper[4745]: E0319 00:10:12.138258 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-w2988_openshift-ovn-kubernetes(21835778-c889-4031-b630-586c00f200f9)\"" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9"
Mar 19 00:10:13 crc kubenswrapper[4745]: I0319 00:10:13.137427 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:13 crc kubenswrapper[4745]: I0319 00:10:13.137422 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:13 crc kubenswrapper[4745]: E0319 00:10:13.137586 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:10:13 crc kubenswrapper[4745]: I0319 00:10:13.137452 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:13 crc kubenswrapper[4745]: E0319 00:10:13.137637 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:10:13 crc kubenswrapper[4745]: I0319 00:10:13.137439 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:13 crc kubenswrapper[4745]: E0319 00:10:13.137672 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:10:13 crc kubenswrapper[4745]: E0319 00:10:13.137706 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:10:15 crc kubenswrapper[4745]: I0319 00:10:15.137428 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:15 crc kubenswrapper[4745]: I0319 00:10:15.137497 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:15 crc kubenswrapper[4745]: I0319 00:10:15.137498 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:15 crc kubenswrapper[4745]: I0319 00:10:15.137428 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:15 crc kubenswrapper[4745]: E0319 00:10:15.137602 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:10:15 crc kubenswrapper[4745]: E0319 00:10:15.137657 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:10:15 crc kubenswrapper[4745]: E0319 00:10:15.137738 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:10:15 crc kubenswrapper[4745]: E0319 00:10:15.137821 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:10:16 crc kubenswrapper[4745]: E0319 00:10:16.266658 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 19 00:10:17 crc kubenswrapper[4745]: I0319 00:10:17.137147 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:17 crc kubenswrapper[4745]: I0319 00:10:17.137249 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:17 crc kubenswrapper[4745]: E0319 00:10:17.137283 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:10:17 crc kubenswrapper[4745]: I0319 00:10:17.137311 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:17 crc kubenswrapper[4745]: I0319 00:10:17.137324 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:17 crc kubenswrapper[4745]: E0319 00:10:17.137402 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:10:17 crc kubenswrapper[4745]: E0319 00:10:17.137515 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:10:17 crc kubenswrapper[4745]: E0319 00:10:17.137611 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.093301 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/1.log"
Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.093940 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/0.log"
Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.094027 4745 generic.go:334] "Generic (PLEG): container finished" podID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" containerID="486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915" exitCode=1
Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.094074 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerDied","Data":"486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915"}
Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.094162 4745 scope.go:117] "RemoveContainer" containerID="7a89de0a3da543abda10e4ef32f70de519d42c13017bb860834cb9cd5693a8a2"
Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.094591 4745 scope.go:117] "RemoveContainer" containerID="486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915"
Mar 19 00:10:19 crc kubenswrapper[4745]: E0319 00:10:19.094814 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-mlwp7_openshift-multus(6a0ae9c0-f19a-4038-be03-0fa6d223ebbf)\"" pod="openshift-multus/multus-mlwp7" podUID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf"
Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.136768 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.136801 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.136935 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:19 crc kubenswrapper[4745]: E0319 00:10:19.137019 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 19 00:10:19 crc kubenswrapper[4745]: I0319 00:10:19.137115 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:19 crc kubenswrapper[4745]: E0319 00:10:19.137117 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:10:19 crc kubenswrapper[4745]: E0319 00:10:19.137219 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:10:19 crc kubenswrapper[4745]: E0319 00:10:19.137421 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 19 00:10:20 crc kubenswrapper[4745]: I0319 00:10:20.099724 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/1.log"
Mar 19 00:10:21 crc kubenswrapper[4745]: I0319 00:10:21.137167 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:21 crc kubenswrapper[4745]: I0319 00:10:21.137197 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:21 crc kubenswrapper[4745]: I0319 00:10:21.137241 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:21 crc kubenswrapper[4745]: E0319 00:10:21.138426 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7"
Mar 19 00:10:21 crc kubenswrapper[4745]: E0319 00:10:21.138169 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 19 00:10:21 crc kubenswrapper[4745]: I0319 00:10:21.137257 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:21 crc kubenswrapper[4745]: E0319 00:10:21.138560 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:21 crc kubenswrapper[4745]: E0319 00:10:21.138695 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:21 crc kubenswrapper[4745]: E0319 00:10:21.268110 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:23 crc kubenswrapper[4745]: I0319 00:10:23.137696 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:23 crc kubenswrapper[4745]: I0319 00:10:23.137749 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:23 crc kubenswrapper[4745]: I0319 00:10:23.137782 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:23 crc kubenswrapper[4745]: E0319 00:10:23.137837 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:23 crc kubenswrapper[4745]: I0319 00:10:23.137713 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:23 crc kubenswrapper[4745]: E0319 00:10:23.137981 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:23 crc kubenswrapper[4745]: E0319 00:10:23.138047 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:23 crc kubenswrapper[4745]: E0319 00:10:23.138102 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:25 crc kubenswrapper[4745]: I0319 00:10:25.137109 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:25 crc kubenswrapper[4745]: I0319 00:10:25.137166 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:25 crc kubenswrapper[4745]: I0319 00:10:25.137263 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:25 crc kubenswrapper[4745]: E0319 00:10:25.137446 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:25 crc kubenswrapper[4745]: E0319 00:10:25.137553 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:25 crc kubenswrapper[4745]: E0319 00:10:25.137724 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:25 crc kubenswrapper[4745]: I0319 00:10:25.138013 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:25 crc kubenswrapper[4745]: E0319 00:10:25.138124 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:26 crc kubenswrapper[4745]: I0319 00:10:26.139006 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:10:26 crc kubenswrapper[4745]: E0319 00:10:26.268538 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.021638 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4r5k5"] Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.021829 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:27 crc kubenswrapper[4745]: E0319 00:10:27.022070 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.125145 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/3.log" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.127838 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerStarted","Data":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.128244 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.137042 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.137073 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.137047 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:27 crc kubenswrapper[4745]: E0319 00:10:27.137175 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:27 crc kubenswrapper[4745]: E0319 00:10:27.137326 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:27 crc kubenswrapper[4745]: E0319 00:10:27.137357 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:27 crc kubenswrapper[4745]: I0319 00:10:27.157739 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podStartSLOduration=153.157722351 podStartE2EDuration="2m33.157722351s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:27.157666609 +0000 UTC m=+191.695861760" watchObservedRunningTime="2026-03-19 00:10:27.157722351 +0000 UTC m=+191.695917482" Mar 19 00:10:29 crc kubenswrapper[4745]: I0319 00:10:29.137636 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:29 crc kubenswrapper[4745]: I0319 00:10:29.137682 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:29 crc kubenswrapper[4745]: I0319 00:10:29.137665 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:29 crc kubenswrapper[4745]: I0319 00:10:29.137635 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:29 crc kubenswrapper[4745]: E0319 00:10:29.137855 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:29 crc kubenswrapper[4745]: E0319 00:10:29.137756 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:29 crc kubenswrapper[4745]: E0319 00:10:29.137962 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:29 crc kubenswrapper[4745]: E0319 00:10:29.138022 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:30 crc kubenswrapper[4745]: I0319 00:10:30.137585 4745 scope.go:117] "RemoveContainer" containerID="486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.137226 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.137299 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:31 crc kubenswrapper[4745]: E0319 00:10:31.137717 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.137371 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.137361 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:31 crc kubenswrapper[4745]: E0319 00:10:31.137851 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:31 crc kubenswrapper[4745]: E0319 00:10:31.138029 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:31 crc kubenswrapper[4745]: E0319 00:10:31.138118 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.142403 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/1.log" Mar 19 00:10:31 crc kubenswrapper[4745]: I0319 00:10:31.142455 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerStarted","Data":"24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8"} Mar 19 00:10:31 crc kubenswrapper[4745]: E0319 00:10:31.270657 4745 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 00:10:33 crc kubenswrapper[4745]: I0319 00:10:33.137371 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:33 crc kubenswrapper[4745]: I0319 00:10:33.137463 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:33 crc kubenswrapper[4745]: I0319 00:10:33.137534 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:33 crc kubenswrapper[4745]: E0319 00:10:33.137533 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:33 crc kubenswrapper[4745]: E0319 00:10:33.137628 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:33 crc kubenswrapper[4745]: I0319 00:10:33.137678 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:33 crc kubenswrapper[4745]: E0319 00:10:33.137767 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:33 crc kubenswrapper[4745]: E0319 00:10:33.137840 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:35 crc kubenswrapper[4745]: I0319 00:10:35.137407 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:35 crc kubenswrapper[4745]: I0319 00:10:35.137498 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:35 crc kubenswrapper[4745]: I0319 00:10:35.137518 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:35 crc kubenswrapper[4745]: I0319 00:10:35.137505 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:35 crc kubenswrapper[4745]: E0319 00:10:35.137678 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 00:10:35 crc kubenswrapper[4745]: E0319 00:10:35.137852 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 00:10:35 crc kubenswrapper[4745]: E0319 00:10:35.137963 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 00:10:35 crc kubenswrapper[4745]: E0319 00:10:35.138036 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4r5k5" podUID="5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.137038 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.137095 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.137124 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.137309 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.142230 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.143212 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.143401 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.143475 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.143831 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 00:10:37 crc kubenswrapper[4745]: I0319 00:10:37.144278 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 00:10:38 crc kubenswrapper[4745]: I0319 00:10:38.981388 4745 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.027999 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-h477l"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.028611 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.029017 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hg72d"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.029517 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.031736 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.032248 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zxjjt"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.032270 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.033257 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.033608 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.034569 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.034581 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.035678 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.035708 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.036855 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.036895 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.037491 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.037802 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-ssbjs"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.038271 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.043711 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.044403 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.044813 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.046689 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.047257 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-ljtrr"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.047570 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-ljtrr"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.047955 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.049332 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.049566 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.049681 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.050467 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.050495 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.051426 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-pruner-29564640-xrq9h"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.052092 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29564640-xrq9h"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.054526 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.054841 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.055274 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.056303 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.056316 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.058970 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.059848 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.070990 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.071279 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.072219 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.073306 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.073585 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.081383 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.081982 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.082193 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.082395 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.083376 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.083536 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.084147 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.084334 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.085656 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.086859 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.087364 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.087452 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.088012 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.088166 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.089648 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108605 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tfct\" (UniqueName: \"kubernetes.io/projected/d82019fe-0d36-4087-83db-41c03fa4fc66-kube-api-access-8tfct\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108641 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-dir\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108664 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-etcd-serving-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108682 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/660e3fac-6534-49e0-a81e-38971c9fec3f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108705 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108723 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108743 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108759 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhxsw\" (UniqueName: \"kubernetes.io/projected/46576b1f-4646-44ba-a896-d509b05801cd-kube-api-access-vhxsw\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108776 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108794 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108819 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108858 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e036dd-6b1a-48ec-a9f4-a976673a6208-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108873 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8bxx\" (UniqueName: \"kubernetes.io/projected/660e3fac-6534-49e0-a81e-38971c9fec3f-kube-api-access-x8bxx\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108912 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-serving-cert\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108927 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-client\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108945 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zznl9\" (UniqueName: \"kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108962 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108976 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-audit\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.108991 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-service-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109022 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-encryption-config\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109038 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-audit-dir\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109065 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxp4h\" (UniqueName: \"kubernetes.io/projected/bee68b29-e3e7-4a15-9bda-981764261dcc-kube-api-access-rxp4h\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109085 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-encryption-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109105 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-866gr\" (UniqueName: \"kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109123 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-image-import-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109143 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-serving-cert\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109160 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109180 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-policies\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109219 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109238 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109256 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-config\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109273 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-config\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109290 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-images\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109304 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl6fn\" (UniqueName: \"kubernetes.io/projected/83e036dd-6b1a-48ec-a9f4-a976673a6208-kube-api-access-tl6fn\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109323 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109339 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109354 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e036dd-6b1a-48ec-a9f4-a976673a6208-config\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109368 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d82019fe-0d36-4087-83db-41c03fa4fc66-serving-cert\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109385 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-node-pullsecrets\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109400 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-etcd-client\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.109575 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-522nc"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110007 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110087 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110121 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110177 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110203 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110328 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110456 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110529 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110625 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110639 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110639 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.110640 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.111190 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113034 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113059 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113202 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113282 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113415 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113608 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113669 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113602 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113936 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114074 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114325 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114457 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114492 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114588 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114670 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114756 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114807 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.114927 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.113203 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115130 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115196 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"serviceca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115327 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115356 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115487 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115549 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115584 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115671 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115700 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115742 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115807 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115496 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.115811 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.117009 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"pruner-dockercfg-p7bcw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.118455 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.118957 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.119367 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.122049 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fpxzh"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.122722 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.123228 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.123755 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.125923 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.126061 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.126158 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.126533 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.127165 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.131467 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-jrq7v"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.132429 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.132542 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.132773 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c8nmg"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.133623 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.135040 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.135493 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.138841 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.139053 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.139322 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.140181 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.140679 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.140824 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j2mf5"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.141075 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.141134 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.141516 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.141685 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.142122 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.142280 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.142406 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.163054 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.163869 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.164412 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.165143 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.167204 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.169663 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.174126 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.202663 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.204212 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.205153 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.206448 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.206761 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.207702 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.208002 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.208123 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.210361 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: 
\"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.210988 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211099 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zznl9\" (UniqueName: \"kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.210810 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.210498 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.210597 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211177 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-serving-cert\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211612 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-client\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211643 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211680 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-audit\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211702 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-service-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: 
\"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211725 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab89302b-10a8-43fa-ad93-699274acaac3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211774 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-default-certificate\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211807 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-audit-dir\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211834 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-encryption-config\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211857 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-serving-cert\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211903 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fknf6\" (UniqueName: \"kubernetes.io/projected/116f15d2-ff67-4a98-846a-29bd6a129bbd-kube-api-access-fknf6\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211928 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-service-ca\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211954 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-auth-proxy-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211985 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc 
kubenswrapper[4745]: I0319 00:10:39.212012 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-service-ca-bundle\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212038 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca\") pod \"image-pruner-29564640-xrq9h\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212060 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnckx\" (UniqueName: \"kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx\") pod \"image-pruner-29564640-xrq9h\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212087 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-metrics-certs\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212115 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxp4h\" (UniqueName: \"kubernetes.io/projected/bee68b29-e3e7-4a15-9bda-981764261dcc-kube-api-access-rxp4h\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: 
\"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212136 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-client\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212156 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212181 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rssq\" (UniqueName: \"kubernetes.io/projected/053b13b0-078a-45ea-a005-e38aab17b42f-kube-api-access-7rssq\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212204 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsb2s\" (UniqueName: \"kubernetes.io/projected/19580e75-5123-4261-ac5c-96dbd7834613-kube-api-access-tsb2s\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212239 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-encryption-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212270 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-866gr\" (UniqueName: \"kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212292 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-image-import-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212320 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212345 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-serving-cert\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212366 4745 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-policies\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212384 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab89302b-10a8-43fa-ad93-699274acaac3-config\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212401 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-trusted-ca-bundle\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212423 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-config\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212441 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-machine-approver-tls\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc 
kubenswrapper[4745]: I0319 00:10:39.212457 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-console-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212482 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212501 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212544 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19580e75-5123-4261-ac5c-96dbd7834613-serving-cert\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212565 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212584 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212603 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212622 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212641 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-config\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212659 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" 
(UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212678 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-images\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212696 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-config\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212713 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199f2552-58de-4ea8-adf5-f1aee925f49b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212731 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199f2552-58de-4ea8-adf5-f1aee925f49b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212748 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdpdj\" (UniqueName: \"kubernetes.io/projected/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-kube-api-access-tdpdj\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212765 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212787 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-config\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212806 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212824 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tl6fn\" (UniqueName: \"kubernetes.io/projected/83e036dd-6b1a-48ec-a9f4-a976673a6208-kube-api-access-tl6fn\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212842 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jnqh\" (UniqueName: \"kubernetes.io/projected/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-kube-api-access-9jnqh\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212859 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-kube-api-access-cw6nw\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212898 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212916 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212936 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e036dd-6b1a-48ec-a9f4-a976673a6208-config\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212953 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d82019fe-0d36-4087-83db-41c03fa4fc66-serving-cert\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212968 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-service-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.212984 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88wjz\" (UniqueName: \"kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213000 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-stats-auth\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213018 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-etcd-client\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213038 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-node-pullsecrets\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-etcd-serving-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213070 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tfct\" (UniqueName: \"kubernetes.io/projected/d82019fe-0d36-4087-83db-41c03fa4fc66-kube-api-access-8tfct\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213086 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-dir\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213107 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/660e3fac-6534-49e0-a81e-38971c9fec3f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213125 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-oauth-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213140 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5npf\" (UniqueName: \"kubernetes.io/projected/fd804c03-3021-44bd-8ce8-a10a482c59b4-kube-api-access-x5npf\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213163 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213437 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.213871 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hg72d"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.214008 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.214947 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.214977 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-oauth-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215003 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd804c03-3021-44bd-8ce8-a10a482c59b4-serving-cert\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215021 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215040 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215063 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215079 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhxsw\" (UniqueName: \"kubernetes.io/projected/46576b1f-4646-44ba-a896-d509b05801cd-kube-api-access-vhxsw\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215096 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215114 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215133 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzlgl\" (UniqueName: \"kubernetes.io/projected/199f2552-58de-4ea8-adf5-f1aee925f49b-kube-api-access-fzlgl\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215156 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab89302b-10a8-43fa-ad93-699274acaac3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215173 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215192 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215210 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/19580e75-5123-4261-ac5c-96dbd7834613-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215232 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215248 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2fgf\" (UniqueName: \"kubernetes.io/projected/38dd3b53-64de-4201-b427-0b1bc3e51849-kube-api-access-z2fgf\") pod \"downloads-7954f5f757-ljtrr\" (UID: \"38dd3b53-64de-4201-b427-0b1bc3e51849\") " pod="openshift-console/downloads-7954f5f757-ljtrr"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215265 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-trusted-ca\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215294 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8bxx\" (UniqueName: \"kubernetes.io/projected/660e3fac-6534-49e0-a81e-38971c9fec3f-kube-api-access-x8bxx\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215312 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215328 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.215353 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e036dd-6b1a-48ec-a9f4-a976673a6208-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.216239 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.211777 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.216813 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-audit\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.216945 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-service-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.216960 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-audit-dir\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.217798 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.217955 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-image-import-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.218565 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.219342 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83e036dd-6b1a-48ec-a9f4-a976673a6208-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.221992 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-policies\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.222357 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.222375 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d82019fe-0d36-4087-83db-41c03fa4fc66-config\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.222965 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bee68b29-e3e7-4a15-9bda-981764261dcc-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223031 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bee68b29-e3e7-4a15-9bda-981764261dcc-audit-dir\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223196 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-encryption-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223283 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223333 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/46576b1f-4646-44ba-a896-d509b05801cd-node-pullsecrets\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223385 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-config\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223636 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-etcd-client\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.223776 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-config\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.224497 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.224654 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.224813 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.225057 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-encryption-config\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.225468 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-etcd-serving-ca\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.226625 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-serving-cert\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.226668 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bee68b29-e3e7-4a15-9bda-981764261dcc-serving-cert\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.227835 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.228586 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/660e3fac-6534-49e0-a81e-38971c9fec3f-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.230699 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d82019fe-0d36-4087-83db-41c03fa4fc66-serving-cert\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.231536 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/46576b1f-4646-44ba-a896-d509b05801cd-etcd-client\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.232668 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.235077 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.235623 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46576b1f-4646-44ba-a896-d509b05801cd-trusted-ca-bundle\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.237295 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.239162 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/660e3fac-6534-49e0-a81e-38971c9fec3f-images\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.239233 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.240186 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t99wg"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.240291 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.241105 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t99wg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.243570 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.244184 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564650-7k6ld"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.244311 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.244752 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83e036dd-6b1a-48ec-a9f4-a976673a6208-config\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.244916 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564650-7k6ld"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.245106 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.245869 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.250358 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.251198 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.251711 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.252113 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.252151 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.252455 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.253488 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.254569 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-h477l"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.254605 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.254895 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.255739 4745 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.256158 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.257107 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.258410 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v2dhg"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.259165 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.259165 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.259341 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.259984 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.260855 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ssbjs"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.262143 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.263498 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ljtrr"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.268555 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29564640-xrq9h"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.269714 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.271227 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.272379 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.275538 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-8m25n"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.281561 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-t28kd"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.282168 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.284815 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.285650 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.288592 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.291112 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.291250 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.292718 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.293777 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.295062 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.296228 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564650-7k6ld"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.297624 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"] 
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.299988 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zxjjt"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.301168 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fpxzh"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.302764 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.304029 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.304849 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.305988 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.307192 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t99wg"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.308526 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.310415 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.311200 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.311428 4745 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["hostpath-provisioner/csi-hostpathplugin-54kzj"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.312864 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.313006 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.314110 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c8nmg"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.315218 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-522nc"] Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.315976 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316006 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-service-ca-bundle\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316029 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-auth-proxy-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: 
\"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316051 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca\") pod \"image-pruner-29564640-xrq9h\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316070 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnckx\" (UniqueName: \"kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx\") pod \"image-pruner-29564640-xrq9h\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316090 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-metrics-certs\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316105 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-client\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316124 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-serving-cert\") pod 
\"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316145 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rssq\" (UniqueName: \"kubernetes.io/projected/053b13b0-078a-45ea-a005-e38aab17b42f-kube-api-access-7rssq\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316163 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsb2s\" (UniqueName: \"kubernetes.io/projected/19580e75-5123-4261-ac5c-96dbd7834613-kube-api-access-tsb2s\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316231 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab89302b-10a8-43fa-ad93-699274acaac3-config\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316252 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-trusted-ca-bundle\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316284 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-config\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316301 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-machine-approver-tls\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316316 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-console-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316336 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316354 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316392 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/19580e75-5123-4261-ac5c-96dbd7834613-serving-cert\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316411 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316428 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316448 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316466 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199f2552-58de-4ea8-adf5-f1aee925f49b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316484 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199f2552-58de-4ea8-adf5-f1aee925f49b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316502 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdpdj\" (UniqueName: \"kubernetes.io/projected/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-kube-api-access-tdpdj\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316518 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316537 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-config\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316563 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jnqh\" 
(UniqueName: \"kubernetes.io/projected/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-kube-api-access-9jnqh\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316579 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-kube-api-access-cw6nw\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316597 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316617 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-service-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316635 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88wjz\" (UniqueName: \"kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 
00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316656 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-stats-auth\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316683 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-oauth-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316703 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5npf\" (UniqueName: \"kubernetes.io/projected/fd804c03-3021-44bd-8ce8-a10a482c59b4-kube-api-access-x5npf\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316724 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-oauth-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316741 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd804c03-3021-44bd-8ce8-a10a482c59b4-serving-cert\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " 
pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316767 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316784 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316810 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzlgl\" (UniqueName: \"kubernetes.io/projected/199f2552-58de-4ea8-adf5-f1aee925f49b-kube-api-access-fzlgl\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316829 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab89302b-10a8-43fa-ad93-699274acaac3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316846 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316862 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316898 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/19580e75-5123-4261-ac5c-96dbd7834613-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316919 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2fgf\" (UniqueName: \"kubernetes.io/projected/38dd3b53-64de-4201-b427-0b1bc3e51849-kube-api-access-z2fgf\") pod \"downloads-7954f5f757-ljtrr\" (UID: \"38dd3b53-64de-4201-b427-0b1bc3e51849\") " pod="openshift-console/downloads-7954f5f757-ljtrr"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316936 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-trusted-ca\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316963 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.316991 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317025 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317045 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317080 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab89302b-10a8-43fa-ad93-699274acaac3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317103 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-default-certificate\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317131 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-serving-cert\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317150 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317174 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-auth-proxy-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317152 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fknf6\" (UniqueName: \"kubernetes.io/projected/116f15d2-ff67-4a98-846a-29bd6a129bbd-kube-api-access-fknf6\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317330 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-service-ca\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317824 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-m56ls"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.318135 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-service-ca\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.318818 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.318942 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab89302b-10a8-43fa-ad93-699274acaac3-config\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.318961 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m56ls"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.317846 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca\") pod \"image-pruner-29564640-xrq9h\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.319482 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.319980 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.320388 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-trusted-ca-bundle\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.320589 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-config\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.320921 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.320957 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/19580e75-5123-4261-ac5c-96dbd7834613-available-featuregates\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.321950 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.322209 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-trusted-ca\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.322475 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.322536 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.322552 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.324346 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-oauth-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.324726 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-serving-cert\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.325099 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/053b13b0-078a-45ea-a005-e38aab17b42f-console-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.325138 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.325165 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/199f2552-58de-4ea8-adf5-f1aee925f49b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.325552 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/053b13b0-078a-45ea-a005-e38aab17b42f-console-oauth-config\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.325782 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd804c03-3021-44bd-8ce8-a10a482c59b4-config\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.326726 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/199f2552-58de-4ea8-adf5-f1aee925f49b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.327200 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j2mf5"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.327234 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t28kd"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.327381 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.327524 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-machine-approver-tls\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.328347 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.329385 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.329461 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.330584 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.330662 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v2dhg"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.330901 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.331342 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.331983 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19580e75-5123-4261-ac5c-96dbd7834613-serving-cert\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.332214 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-54kzj"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.330232 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.333291 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m56ls"]
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.333727 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-stats-auth\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.334459 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.335893 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fd804c03-3021-44bd-8ce8-a10a482c59b4-serving-cert\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.336106 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab89302b-10a8-43fa-ad93-699274acaac3-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.336847 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.337135 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.338479 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-default-certificate\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.341548 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-metrics-certs\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.345274 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.351214 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.357589 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-service-ca-bundle\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.371272 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.390634 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.410604 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.411382 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-config\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.431358 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.436754 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-serving-cert\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.451576 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.459979 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-client\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.471610 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.480095 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-service-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.491479 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.512070 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.513861 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/116f15d2-ff67-4a98-846a-29bd6a129bbd-etcd-ca\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.531988 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.562257 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.571873 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.591571 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.636092 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.651397 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.671518 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.691872 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.712046 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.731907 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.751509 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.773246 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.791357 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.811321 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.831936 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.851913 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.892423 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.898510 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zznl9\" (UniqueName: \"kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9\") pod \"controller-manager-879f6c89f-v4wtx\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.911124 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.932439 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.958824 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 00:10:39 crc kubenswrapper[4745]: I0319 00:10:39.970830 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.009833 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.012051 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.017364 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-866gr\" (UniqueName: \"kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr\") pod \"route-controller-manager-6576b87f9c-bjlzj\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.031477 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.033575 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.051914 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.071512 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.110393 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.112450 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.153064 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.159639 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tl6fn\" (UniqueName: \"kubernetes.io/projected/83e036dd-6b1a-48ec-a9f4-a976673a6208-kube-api-access-tl6fn\") pod \"openshift-apiserver-operator-796bbdcf4f-445gs\" (UID: \"83e036dd-6b1a-48ec-a9f4-a976673a6208\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.171869 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.191246 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.223944 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"]
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.229094 4745 request.go:700] Waited for 1.006551228s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication-operator/serviceaccounts/authentication-operator/token
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.230746 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhxsw\" (UniqueName: \"kubernetes.io/projected/46576b1f-4646-44ba-a896-d509b05801cd-kube-api-access-vhxsw\") pod \"apiserver-76f77b778f-zxjjt\" (UID: \"46576b1f-4646-44ba-a896-d509b05801cd\") " pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.249493 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tfct\" (UniqueName: \"kubernetes.io/projected/d82019fe-0d36-4087-83db-41c03fa4fc66-kube-api-access-8tfct\") pod \"authentication-operator-69f744f599-h477l\" (UID: \"d82019fe-0d36-4087-83db-41c03fa4fc66\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.250986 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l"
Mar 19 00:10:40 crc kubenswrapper[4745]: W0319 00:10:40.259235 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16dafcd2_537e_46fe_8028_41bc6ff146a0.slice/crio-4ccfdcfde3a6b6c854686aad74a4c94d360f5531e573b9fe69f9444980375eba WatchSource:0}: Error finding container 4ccfdcfde3a6b6c854686aad74a4c94d360f5531e573b9fe69f9444980375eba: Status 404 returned error can't find the container with id 4ccfdcfde3a6b6c854686aad74a4c94d360f5531e573b9fe69f9444980375eba
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.265176 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"]
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.273142 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxp4h\" (UniqueName: \"kubernetes.io/projected/bee68b29-e3e7-4a15-9bda-981764261dcc-kube-api-access-rxp4h\") pod \"apiserver-7bbb656c7d-lpdzv\" (UID: \"bee68b29-e3e7-4a15-9bda-981764261dcc\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.289093 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8bxx\" (UniqueName: \"kubernetes.io/projected/660e3fac-6534-49e0-a81e-38971c9fec3f-kube-api-access-x8bxx\") pod \"machine-api-operator-5694c8668f-hg72d\" (UID: \"660e3fac-6534-49e0-a81e-38971c9fec3f\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.293057 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.311311 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.324145 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.330824 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.351505 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.371477 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.390765 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.404479 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.412402 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.431199 4745 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.434265 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.451430 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.456294 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-h477l"] Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.471852 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: W0319 00:10:40.480045 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd82019fe_0d36_4087_83db_41c03fa4fc66.slice/crio-206dbc7415f07b101faed205fca7d56e9c46ca95fa8268a4a85f25e579d05c59 WatchSource:0}: Error finding container 206dbc7415f07b101faed205fca7d56e9c46ca95fa8268a4a85f25e579d05c59: Status 404 returned error can't find the container with id 206dbc7415f07b101faed205fca7d56e9c46ca95fa8268a4a85f25e579d05c59 Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.491063 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.512965 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.530355 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-zxjjt"] Mar 19 00:10:40 crc kubenswrapper[4745]: W0319 00:10:40.548799 4745 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46576b1f_4646_44ba_a896_d509b05801cd.slice/crio-024541ff85ddb6b0038e11579ec2a88508155e56b4fc962bb2fc758deb3c6ed9 WatchSource:0}: Error finding container 024541ff85ddb6b0038e11579ec2a88508155e56b4fc962bb2fc758deb3c6ed9: Status 404 returned error can't find the container with id 024541ff85ddb6b0038e11579ec2a88508155e56b4fc962bb2fc758deb3c6ed9 Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.551092 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.567249 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.572503 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.590839 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.611022 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.620345 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs"] Mar 19 00:10:40 crc kubenswrapper[4745]: W0319 00:10:40.629537 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83e036dd_6b1a_48ec_a9f4_a976673a6208.slice/crio-592e50c93e589da5412df7388373e65ebf8a92b12ad4bdfe866440f8b3ee5dea WatchSource:0}: Error finding container 
592e50c93e589da5412df7388373e65ebf8a92b12ad4bdfe866440f8b3ee5dea: Status 404 returned error can't find the container with id 592e50c93e589da5412df7388373e65ebf8a92b12ad4bdfe866440f8b3ee5dea Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.631946 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.652869 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.653268 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"] Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.676557 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.692311 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.712216 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.767776 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.768347 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.770394 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 
00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.792265 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.797416 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-hg72d"] Mar 19 00:10:40 crc kubenswrapper[4745]: W0319 00:10:40.809585 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod660e3fac_6534_49e0_a81e_38971c9fec3f.slice/crio-763d62f0cde3fdb647231e2cf21dddd2e9d6cc7c67dc8e556eda3ff5255126df WatchSource:0}: Error finding container 763d62f0cde3fdb647231e2cf21dddd2e9d6cc7c67dc8e556eda3ff5255126df: Status 404 returned error can't find the container with id 763d62f0cde3fdb647231e2cf21dddd2e9d6cc7c67dc8e556eda3ff5255126df Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.810421 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.830990 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.851504 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.870797 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.891628 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.911597 4745 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.932163 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.951977 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.970339 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 00:10:40 crc kubenswrapper[4745]: I0319 00:10:40.993951 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.013489 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.032127 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.051325 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.071231 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.091399 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.111755 4745 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.131083 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.150626 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.171219 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.192418 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.211154 4745 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.212624 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" event={"ID":"555c1cf8-c2b3-4e47-9fa9-314a8672b437","Type":"ContainerStarted","Data":"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.212710 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" event={"ID":"555c1cf8-c2b3-4e47-9fa9-314a8672b437","Type":"ContainerStarted","Data":"4cb2967d4cad71f05028927d26e36d1cb64a836920dd56873c07f0bfdd7a7214"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.212944 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.215213 
4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" event={"ID":"16dafcd2-537e-46fe-8028-41bc6ff146a0","Type":"ContainerStarted","Data":"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.215277 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" event={"ID":"16dafcd2-537e-46fe-8028-41bc6ff146a0","Type":"ContainerStarted","Data":"4ccfdcfde3a6b6c854686aad74a4c94d360f5531e573b9fe69f9444980375eba"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.215381 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.217267 4745 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-v4wtx container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.217349 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.219851 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" event={"ID":"83e036dd-6b1a-48ec-a9f4-a976673a6208","Type":"ContainerStarted","Data":"1247c1f125243c467fd6c5c4871847fc5bd9c2c4772f9df2c85f3882410c62db"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.219919 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" event={"ID":"83e036dd-6b1a-48ec-a9f4-a976673a6208","Type":"ContainerStarted","Data":"592e50c93e589da5412df7388373e65ebf8a92b12ad4bdfe866440f8b3ee5dea"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.229033 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" event={"ID":"d82019fe-0d36-4087-83db-41c03fa4fc66","Type":"ContainerStarted","Data":"f333d55613085cf065f0326ce3515bb88d1b269111e254b53a6bdee309dc10d8"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.229100 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" event={"ID":"d82019fe-0d36-4087-83db-41c03fa4fc66","Type":"ContainerStarted","Data":"206dbc7415f07b101faed205fca7d56e9c46ca95fa8268a4a85f25e579d05c59"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.230547 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.233298 4745 generic.go:334] "Generic (PLEG): container finished" podID="46576b1f-4646-44ba-a896-d509b05801cd" containerID="2b6247b74f321e3c9fe6e40c745a20e16808f6be322f361b906cd1acad47a45c" exitCode=0 Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.233467 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" event={"ID":"46576b1f-4646-44ba-a896-d509b05801cd","Type":"ContainerDied","Data":"2b6247b74f321e3c9fe6e40c745a20e16808f6be322f361b906cd1acad47a45c"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.233526 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" 
event={"ID":"46576b1f-4646-44ba-a896-d509b05801cd","Type":"ContainerStarted","Data":"024541ff85ddb6b0038e11579ec2a88508155e56b4fc962bb2fc758deb3c6ed9"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.237135 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" event={"ID":"660e3fac-6534-49e0-a81e-38971c9fec3f","Type":"ContainerStarted","Data":"def9e2832bd01bc34e82a756dd4b7c55d6daabcbcf53dab3ac06e388b10d54f0"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.237190 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" event={"ID":"660e3fac-6534-49e0-a81e-38971c9fec3f","Type":"ContainerStarted","Data":"c1e5ff3bf3ede6287e2aa318febf0edcffa1143f843a940ba2865e8265d8d20f"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.237204 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" event={"ID":"660e3fac-6534-49e0-a81e-38971c9fec3f","Type":"ContainerStarted","Data":"763d62f0cde3fdb647231e2cf21dddd2e9d6cc7c67dc8e556eda3ff5255126df"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.239733 4745 generic.go:334] "Generic (PLEG): container finished" podID="bee68b29-e3e7-4a15-9bda-981764261dcc" containerID="e7c145419018b8d03972266370c33562f5717fa3cfbdb424202a6f3c7a02a817" exitCode=0 Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.239867 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" event={"ID":"bee68b29-e3e7-4a15-9bda-981764261dcc","Type":"ContainerDied","Data":"e7c145419018b8d03972266370c33562f5717fa3cfbdb424202a6f3c7a02a817"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.240085 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" 
event={"ID":"bee68b29-e3e7-4a15-9bda-981764261dcc","Type":"ContainerStarted","Data":"77b4922577ab245ca2c7b782a0c40bdddaf3482c9e9f3098984860ebf3a3ba17"} Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.250006 4745 request.go:700] Waited for 1.932693634s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-etcd-operator/serviceaccounts/etcd-operator/token Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.282232 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fknf6\" (UniqueName: \"kubernetes.io/projected/116f15d2-ff67-4a98-846a-29bd6a129bbd-kube-api-access-fknf6\") pod \"etcd-operator-b45778765-c8nmg\" (UID: \"116f15d2-ff67-4a98-846a-29bd6a129bbd\") " pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.289859 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rssq\" (UniqueName: \"kubernetes.io/projected/053b13b0-078a-45ea-a005-e38aab17b42f-kube-api-access-7rssq\") pod \"console-f9d7485db-ssbjs\" (UID: \"053b13b0-078a-45ea-a005-e38aab17b42f\") " pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.318711 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsb2s\" (UniqueName: \"kubernetes.io/projected/19580e75-5123-4261-ac5c-96dbd7834613-kube-api-access-tsb2s\") pod \"openshift-config-operator-7777fb866f-5gsjw\" (UID: \"19580e75-5123-4261-ac5c-96dbd7834613\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.328507 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnckx\" (UniqueName: \"kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx\") pod \"image-pruner-29564640-xrq9h\" (UID: 
\"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.357346 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88wjz\" (UniqueName: \"kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz\") pod \"oauth-openshift-558db77b4-522nc\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.366858 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.391160 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.398191 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ab89302b-10a8-43fa-ad93-699274acaac3-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-8xctf\" (UID: \"ab89302b-10a8-43fa-ad93-699274acaac3\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.398315 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzlgl\" (UniqueName: \"kubernetes.io/projected/199f2552-58de-4ea8-adf5-f1aee925f49b-kube-api-access-fzlgl\") pod \"openshift-controller-manager-operator-756b6f6bc6-74pfb\" (UID: \"199f2552-58de-4ea8-adf5-f1aee925f49b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.405232 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.412082 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.415092 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.431854 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.451269 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.465513 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.468200 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.491154 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.492457 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2fgf\" (UniqueName: \"kubernetes.io/projected/38dd3b53-64de-4201-b427-0b1bc3e51849-kube-api-access-z2fgf\") pod \"downloads-7954f5f757-ljtrr\" (UID: \"38dd3b53-64de-4201-b427-0b1bc3e51849\") " pod="openshift-console/downloads-7954f5f757-ljtrr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.524039 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5npf\" (UniqueName: \"kubernetes.io/projected/fd804c03-3021-44bd-8ce8-a10a482c59b4-kube-api-access-x5npf\") pod \"console-operator-58897d9998-fpxzh\" (UID: \"fd804c03-3021-44bd-8ce8-a10a482c59b4\") " pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.574266 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.601734 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cw6nw\" (UniqueName: \"kubernetes.io/projected/5c3bf4d2-ad08-42d0-bd92-f94074fc4833-kube-api-access-cw6nw\") pod \"router-default-5444994796-jrq7v\" (UID: \"5c3bf4d2-ad08-42d0-bd92-f94074fc4833\") " pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.603439 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdpdj\" (UniqueName: \"kubernetes.io/projected/a43287ca-c4a1-424a-86b5-f7f4f1a627a9-kube-api-access-tdpdj\") pod \"machine-approver-56656f9798-kjmjw\" (UID: \"a43287ca-c4a1-424a-86b5-f7f4f1a627a9\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.612120 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jnqh\" (UniqueName: \"kubernetes.io/projected/1cee17e3-a84a-42b3-8cbf-9e4dd2c76330-kube-api-access-9jnqh\") pod \"cluster-samples-operator-665b6dd947-m7nb5\" (UID: \"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.653830 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.683917 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-ljtrr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685112 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrgfx\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-kube-api-access-qrgfx\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685159 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685206 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685227 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685262 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685278 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-metrics-tls\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685295 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2bks\" (UniqueName: \"kubernetes.io/projected/4394045d-753a-4e2b-8ea5-7087d41481d2-kube-api-access-f2bks\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685330 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a51c238-7c2c-470f-8123-472327367ec8-metrics-tls\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685347 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-control-plane-machine-set-operator-tls\") 
pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685453 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vlw2\" (UniqueName: \"kubernetes.io/projected/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-kube-api-access-4vlw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.685537 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-serving-cert\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686074 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686099 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-config\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: 
I0319 00:10:41.686172 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686195 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686217 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qzmc\" (UniqueName: \"kubernetes.io/projected/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-kube-api-access-9qzmc\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686277 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhwtp\" (UniqueName: \"kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686305 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686394 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a51c238-7c2c-470f-8123-472327367ec8-trusted-ca\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.686948 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mp62\" (UniqueName: \"kubernetes.io/projected/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-kube-api-access-6mp62\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687264 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687292 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwkxl\" (UniqueName: \"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-kube-api-access-vwkxl\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 
19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687331 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-webhook-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687349 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687380 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4394045d-753a-4e2b-8ea5-7087d41481d2-tmpfs\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687397 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.687413 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.688670 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfhj\" (UniqueName: \"kubernetes.io/projected/5e57c8e3-d9a0-42bc-98d3-336656039e9c-kube-api-access-ggfhj\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.688777 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.688828 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c46c928-9116-4548-b20e-d9a66d439012-proxy-tls\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.688848 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.689198 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6wxc\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: E0319 00:10:41.689688 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.189674858 +0000 UTC m=+206.727869989 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690015 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c46c928-9116-4548-b20e-d9a66d439012-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690047 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690092 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690132 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jx67\" (UniqueName: \"kubernetes.io/projected/4c46c928-9116-4548-b20e-d9a66d439012-kube-api-access-8jx67\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690214 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.690271 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-cabundle\") pod 
\"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.691517 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-key\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.694308 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.762446 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.776339 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.784882 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.793465 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:41 crc kubenswrapper[4745]: E0319 00:10:41.793719 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.293680893 +0000 UTC m=+206.831876024 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794218 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-proxy-tls\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794304 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vlw2\" (UniqueName: 
\"kubernetes.io/projected/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-kube-api-access-4vlw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794663 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794703 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794748 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-serving-cert\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794782 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-registration-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " 
pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794834 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-certs\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794869 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ead6910f-478b-45b5-b83c-06d3733635cb-metrics-tls\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794936 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794966 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-config\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.794990 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mm5v\" (UniqueName: \"kubernetes.io/projected/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-kube-api-access-5mm5v\") pod 
\"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795027 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795056 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jt2wh\" (UniqueName: \"kubernetes.io/projected/4e2ca532-9e45-44d5-b541-3e9b34352d75-kube-api-access-jt2wh\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795098 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795126 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795185 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jw6qq\" (UniqueName: \"kubernetes.io/projected/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-kube-api-access-jw6qq\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795212 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qzmc\" (UniqueName: \"kubernetes.io/projected/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-kube-api-access-9qzmc\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795260 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhwtp\" (UniqueName: \"kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795287 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795343 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-csi-data-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " 
pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795371 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a51c238-7c2c-470f-8123-472327367ec8-trusted-ca\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795402 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6mp62\" (UniqueName: \"kubernetes.io/projected/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-kube-api-access-6mp62\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795445 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.795471 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwkxl\" (UniqueName: \"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-kube-api-access-vwkxl\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.796837 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-config\") pod 
\"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.796900 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-webhook-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.796929 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.796957 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead6910f-478b-45b5-b83c-06d3733635cb-config-volume\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.796996 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4394045d-753a-4e2b-8ea5-7087d41481d2-tmpfs\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797025 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797056 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-socket-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797113 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797144 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj8wf\" (UniqueName: \"kubernetes.io/projected/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-kube-api-access-wj8wf\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797184 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfhj\" (UniqueName: \"kubernetes.io/projected/5e57c8e3-d9a0-42bc-98d3-336656039e9c-kube-api-access-ggfhj\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797215 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-node-bootstrap-token\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797267 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-cert\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797304 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5kr4\" (UniqueName: \"kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4\") pod \"auto-csr-approver-29564650-7k6ld\" (UID: \"d14d18f5-0177-4458-8ea3-b266cc96d658\") " pod="openshift-infra/auto-csr-approver-29564650-7k6ld"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797362 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98k68\" (UniqueName: \"kubernetes.io/projected/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-kube-api-access-98k68\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797394 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c46c928-9116-4548-b20e-d9a66d439012-proxy-tls\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797421 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797449 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797499 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwv7s\" (UniqueName: \"kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797533 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcq9n\" (UniqueName: \"kubernetes.io/projected/4bf6d6d4-7566-4d1a-acc9-161b6b16f93d-kube-api-access-lcq9n\") pod \"migrator-59844c95c7-bpk8d\" (UID: \"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797566 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6wxc\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797595 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-srv-cert\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.797182 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-config\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.801479 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-serving-cert\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.804434 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.805147 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-apiservice-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.805240 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/11217236-1702-4cd3-b097-e3e410cbbdb4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.816013 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-pruner-29564640-xrq9h"]
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.816117 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vlw2\" (UniqueName: \"kubernetes.io/projected/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-kube-api-access-4vlw2\") pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.818741 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819028 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5pvv\" (UniqueName: \"kubernetes.io/projected/11217236-1702-4cd3-b097-e3e410cbbdb4-kube-api-access-d5pvv\") pod \"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819081 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819106 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c46c928-9116-4548-b20e-d9a66d439012-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819131 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819160 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819248 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-images\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819270 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b00ac21-bd51-4aff-a8fc-d14d2b930940-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819293 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jx67\" (UniqueName: \"kubernetes.io/projected/4c46c928-9116-4548-b20e-d9a66d439012-kube-api-access-8jx67\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819312 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2txg4\" (UniqueName: \"kubernetes.io/projected/b7b46d81-2c53-4021-be8a-f962c576a94c-kube-api-access-2txg4\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819350 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.819748 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.820379 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0a51c238-7c2c-470f-8123-472327367ec8-trusted-ca\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt"
Mar 19 00:10:41 crc kubenswrapper[4745]: E0319 00:10:41.820813 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.320795555 +0000 UTC m=+206.858990686 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.820967 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-cabundle\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.821969 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4394045d-753a-4e2b-8ea5-7087d41481d2-tmpfs\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.822071 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c46c928-9116-4548-b20e-d9a66d439012-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.822520 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptmgs\" (UniqueName: \"kubernetes.io/projected/ead6910f-478b-45b5-b83c-06d3733635cb-kube-api-access-ptmgs\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.822566 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b00ac21-bd51-4aff-a8fc-d14d2b930940-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.824330 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.824997 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-cabundle\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.825169 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-key\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.825222 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7m7w\" (UniqueName: \"kubernetes.io/projected/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-kube-api-access-g7m7w\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.825644 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcpdx\" (UniqueName: \"kubernetes.io/projected/9b00ac21-bd51-4aff-a8fc-d14d2b930940-kube-api-access-fcpdx\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.825803 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-mountpoint-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.826525 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.826602 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrgfx\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-kube-api-access-qrgfx\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.826649 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-profile-collector-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.826876 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.827027 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.827281 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.827328 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.827351 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.827392 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-metrics-tls\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829224 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829269 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-plugins-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829305 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2bks\" (UniqueName: \"kubernetes.io/projected/4394045d-753a-4e2b-8ea5-7087d41481d2-kube-api-access-f2bks\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829522 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a51c238-7c2c-470f-8123-472327367ec8-metrics-tls\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829622 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.829666 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-srv-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.835163 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.836004 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.837143 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.837354 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.837418 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.837483 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0a51c238-7c2c-470f-8123-472327367ec8-metrics-tls\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.838287 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/5e57c8e3-d9a0-42bc-98d3-336656039e9c-signing-key\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.836702 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.838970 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-metrics-tls\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.842753 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/cb0a157b-0f6d-4738-ae67-e29407c2ba8e-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-fztjk\" (UID: \"cb0a157b-0f6d-4738-ae67-e29407c2ba8e\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.846670 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4394045d-753a-4e2b-8ea5-7087d41481d2-webhook-cert\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.849903 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4c46c928-9116-4548-b20e-d9a66d439012-proxy-tls\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.877384 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qzmc\" (UniqueName: \"kubernetes.io/projected/5c970483-eb0d-49da-b0bc-d1685f8bf7f1-kube-api-access-9qzmc\") pod \"service-ca-operator-777779d784-g2gxp\" (UID: \"5c970483-eb0d-49da-b0bc-d1685f8bf7f1\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.904949 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhwtp\" (UniqueName: \"kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp\") pod \"marketplace-operator-79b997595-qn8c4\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.930728 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.930943 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2txg4\" (UniqueName: \"kubernetes.io/projected/b7b46d81-2c53-4021-be8a-f962c576a94c-kube-api-access-2txg4\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931023 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b00ac21-bd51-4aff-a8fc-d14d2b930940-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931043 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ptmgs\" (UniqueName: \"kubernetes.io/projected/ead6910f-478b-45b5-b83c-06d3733635cb-kube-api-access-ptmgs\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931062 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7m7w\" (UniqueName: \"kubernetes.io/projected/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-kube-api-access-g7m7w\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931080 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-mountpoint-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931100 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcpdx\" (UniqueName: \"kubernetes.io/projected/9b00ac21-bd51-4aff-a8fc-d14d2b930940-kube-api-access-fcpdx\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931124 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-profile-collector-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931139 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931161 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931189 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-plugins-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931212 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-srv-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931231 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-proxy-tls\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931248 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931272 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931289 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-registration-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931308 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-certs\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931325 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"
Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931341 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ead6910f-478b-45b5-b83c-06d3733635cb-metrics-tls\") pod
\"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931359 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mm5v\" (UniqueName: \"kubernetes.io/projected/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-kube-api-access-5mm5v\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931379 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jt2wh\" (UniqueName: \"kubernetes.io/projected/4e2ca532-9e45-44d5-b541-3e9b34352d75-kube-api-access-jt2wh\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931398 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jw6qq\" (UniqueName: \"kubernetes.io/projected/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-kube-api-access-jw6qq\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931430 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-csi-data-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931461 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-config\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931478 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead6910f-478b-45b5-b83c-06d3733635cb-config-volume\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931493 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-socket-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931511 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj8wf\" (UniqueName: \"kubernetes.io/projected/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-kube-api-access-wj8wf\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931548 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-node-bootstrap-token\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931578 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-cert\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931604 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5kr4\" (UniqueName: \"kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4\") pod \"auto-csr-approver-29564650-7k6ld\" (UID: \"d14d18f5-0177-4458-8ea3-b266cc96d658\") " pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931621 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931639 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwv7s\" (UniqueName: \"kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931657 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98k68\" (UniqueName: \"kubernetes.io/projected/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-kube-api-access-98k68\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc 
kubenswrapper[4745]: I0319 00:10:41.931677 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcq9n\" (UniqueName: \"kubernetes.io/projected/4bf6d6d4-7566-4d1a-acc9-161b6b16f93d-kube-api-access-lcq9n\") pod \"migrator-59844c95c7-bpk8d\" (UID: \"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931700 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-srv-cert\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931716 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/11217236-1702-4cd3-b097-e3e410cbbdb4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931886 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931919 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5pvv\" (UniqueName: \"kubernetes.io/projected/11217236-1702-4cd3-b097-e3e410cbbdb4-kube-api-access-d5pvv\") pod 
\"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931946 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-images\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.931962 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b00ac21-bd51-4aff-a8fc-d14d2b930940-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.932845 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-csi-data-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.933557 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-config\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.933884 4745 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-images\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.934497 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-socket-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: E0319 00:10:41.934812 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.434791091 +0000 UTC m=+206.972986232 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.935315 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-mountpoint-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.938821 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-plugins-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.939235 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ead6910f-478b-45b5-b83c-06d3733635cb-config-volume\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.939490 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b00ac21-bd51-4aff-a8fc-d14d2b930940-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.939960 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfhj\" (UniqueName: \"kubernetes.io/projected/5e57c8e3-d9a0-42bc-98d3-336656039e9c-kube-api-access-ggfhj\") pod \"service-ca-9c57cc56f-t99wg\" (UID: \"5e57c8e3-d9a0-42bc-98d3-336656039e9c\") " pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.949906 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b00ac21-bd51-4aff-a8fc-d14d2b930940-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.949992 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b7b46d81-2c53-4021-be8a-f962c576a94c-registration-dir\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.950493 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-auth-proxy-config\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.956623 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.968517 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-srv-cert\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.969765 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-ssbjs"] Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.970395 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.973038 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-profile-collector-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.973075 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-profile-collector-cert\") pod 
\"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.973441 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw"] Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.973858 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6wxc\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.974276 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/11217236-1702-4cd3-b097-e3e410cbbdb4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.976850 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4e2ca532-9e45-44d5-b541-3e9b34352d75-srv-cert\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.977289 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ead6910f-478b-45b5-b83c-06d3733635cb-metrics-tls\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 
00:10:41.977991 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-proxy-tls\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.978949 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.981938 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-cert\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.982305 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-certs\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.982504 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:41 crc 
kubenswrapper[4745]: I0319 00:10:41.982531 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-bound-sa-token\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.983436 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-node-bootstrap-token\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:41 crc kubenswrapper[4745]: I0319 00:10:41.992281 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ffa72fe8-3870-425b-81ff-cb17f6d3ac1c-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-z9bmr\" (UID: \"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.012515 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jx67\" (UniqueName: \"kubernetes.io/projected/4c46c928-9116-4548-b20e-d9a66d439012-kube-api-access-8jx67\") pod \"machine-config-controller-84d6567774-5g88t\" (UID: \"4c46c928-9116-4548-b20e-d9a66d439012\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.018229 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token\") pod \"image-registry-697d97f7c8-ggt62\" (UID: 
\"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.032660 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.033037 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.533023865 +0000 UTC m=+207.071218996 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.041481 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mp62\" (UniqueName: \"kubernetes.io/projected/c6b29d2c-9784-4f10-b0d1-09e88ddf5df1-kube-api-access-6mp62\") pod \"dns-operator-744455d44c-j2mf5\" (UID: \"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1\") " pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.056129 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwkxl\" (UniqueName: 
\"kubernetes.io/projected/0a51c238-7c2c-470f-8123-472327367ec8-kube-api-access-vwkxl\") pod \"ingress-operator-5b745b69d9-j9cxt\" (UID: \"0a51c238-7c2c-470f-8123-472327367ec8\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.090069 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrgfx\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-kube-api-access-qrgfx\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.098534 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7c88ec95-11d6-4ef9-8ba8-2bf0495e4474-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-8xwnq\" (UID: \"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.104495 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.110314 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.118796 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2bks\" (UniqueName: \"kubernetes.io/projected/4394045d-753a-4e2b-8ea5-7087d41481d2-kube-api-access-f2bks\") pod \"packageserver-d55dfcdfc-pknnh\" (UID: \"4394045d-753a-4e2b-8ea5-7087d41481d2\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.119109 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.128113 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.133281 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.133755 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.63373558 +0000 UTC m=+207.171930711 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.138506 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.147014 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.156593 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.163486 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5kr4\" (UniqueName: \"kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4\") pod \"auto-csr-approver-29564650-7k6ld\" (UID: \"d14d18f5-0177-4458-8ea3-b266cc96d658\") " pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.181879 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.182534 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.188912 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jt2wh\" (UniqueName: \"kubernetes.io/projected/4e2ca532-9e45-44d5-b541-3e9b34352d75-kube-api-access-jt2wh\") pod \"catalog-operator-68c6474976-wz97d\" (UID: \"4e2ca532-9e45-44d5-b541-3e9b34352d75\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.193642 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.202037 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jw6qq\" (UniqueName: \"kubernetes.io/projected/2b8f7226-6033-4b1b-bd4d-7c045b9d60ef-kube-api-access-jw6qq\") pod \"package-server-manager-789f6589d5-q4x24\" (UID: \"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.210449 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5pvv\" (UniqueName: \"kubernetes.io/projected/11217236-1702-4cd3-b097-e3e410cbbdb4-kube-api-access-d5pvv\") pod \"multus-admission-controller-857f4d67dd-v2dhg\" (UID: \"11217236-1702-4cd3-b097-e3e410cbbdb4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.212761 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.227377 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.234567 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.235028 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.735008115 +0000 UTC m=+207.273203246 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.235805 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj8wf\" (UniqueName: \"kubernetes.io/projected/10e7df4c-01da-48e6-8d5d-20e788cd4cdf-kube-api-access-wj8wf\") pod \"ingress-canary-m56ls\" (UID: \"10e7df4c-01da-48e6-8d5d-20e788cd4cdf\") " pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.272625 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.274211 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2txg4\" (UniqueName: \"kubernetes.io/projected/b7b46d81-2c53-4021-be8a-f962c576a94c-kube-api-access-2txg4\") pod \"csi-hostpathplugin-54kzj\" (UID: \"b7b46d81-2c53-4021-be8a-f962c576a94c\") " pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.274775 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ssbjs" event={"ID":"053b13b0-078a-45ea-a005-e38aab17b42f","Type":"ContainerStarted","Data":"4599ea118ed41967dec054d250458ddd9d4446dbf5e68fcd9b9416f57bb90d83"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.284657 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.296458 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptmgs\" (UniqueName: \"kubernetes.io/projected/ead6910f-478b-45b5-b83c-06d3733635cb-kube-api-access-ptmgs\") pod \"dns-default-t28kd\" (UID: \"ead6910f-478b-45b5-b83c-06d3733635cb\") " pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.297695 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.299148 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jrq7v" event={"ID":"5c3bf4d2-ad08-42d0-bd92-f94074fc4833","Type":"ContainerStarted","Data":"0fce18da0d381d684e8f3380652feea36745ca4b26730b86dc8d697b920e6fe4"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.299275 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7m7w\" (UniqueName: \"kubernetes.io/projected/07e3a371-481b-4e06-a2e9-e12f3ec3d28b-kube-api-access-g7m7w\") pod \"machine-config-operator-74547568cd-p29f9\" (UID: \"07e3a371-481b-4e06-a2e9-e12f3ec3d28b\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.306587 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.313622 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" event={"ID":"ab89302b-10a8-43fa-ad93-699274acaac3","Type":"ContainerStarted","Data":"d338496d01871358c0d340b96464ff899bf3fd9be588fe946dcb971739923471"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.337750 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-c8nmg"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.337824 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-m56ls" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.338259 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.338553 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.838527524 +0000 UTC m=+207.376722665 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.346348 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.346861 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.84684622 +0000 UTC m=+207.385041351 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.348170 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" event={"ID":"19580e75-5123-4261-ac5c-96dbd7834613","Type":"ContainerStarted","Data":"d6f1707d3a61337a62dd4d3650b8c4ac2606e8c3177922a6463e575e12c28609"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.359274 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcpdx\" (UniqueName: \"kubernetes.io/projected/9b00ac21-bd51-4aff-a8fc-d14d2b930940-kube-api-access-fcpdx\") pod \"kube-storage-version-migrator-operator-b67b599dd-8j59v\" (UID: \"9b00ac21-bd51-4aff-a8fc-d14d2b930940\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.361425 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mm5v\" (UniqueName: \"kubernetes.io/projected/e5beee30-fb62-4d40-91bd-c2f4b1efab1f-kube-api-access-5mm5v\") pod \"machine-config-server-8m25n\" (UID: \"e5beee30-fb62-4d40-91bd-c2f4b1efab1f\") " pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.364609 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98k68\" 
(UniqueName: \"kubernetes.io/projected/7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d-kube-api-access-98k68\") pod \"olm-operator-6b444d44fb-c2x4m\" (UID: \"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.376282 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" event={"ID":"46576b1f-4646-44ba-a896-d509b05801cd","Type":"ContainerStarted","Data":"87f903fbf801f35514644058e6b9df1eec3cf9b7864d28e34bc2e418327afbf2"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.386089 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" event={"ID":"a43287ca-c4a1-424a-86b5-f7f4f1a627a9","Type":"ContainerStarted","Data":"84a1f23b1aa1951875f5b0e20abfba39aa7ee93cb5d48a1278af51ebd780e196"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.386414 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-ljtrr"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.386689 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ed8d9c15-ca48-46f9-9368-36693ccdbe8b-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-kqwz5\" (UID: \"ed8d9c15-ca48-46f9-9368-36693ccdbe8b\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.402172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29564640-xrq9h" event={"ID":"88004414-de81-4e3c-9f3f-99f90a3bbc98","Type":"ContainerStarted","Data":"f164174bebded768a6d9e6df2a3ae9216824193f2bf4ab662809fd13bd0bfb0d"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.402826 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.429573 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwv7s\" (UniqueName: \"kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s\") pod \"collect-profiles-29564640-mn6rg\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.434789 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" event={"ID":"bee68b29-e3e7-4a15-9bda-981764261dcc","Type":"ContainerStarted","Data":"9cab78c65b3c450b670b6aa33c7bdd5ac404af8d1d3efaf0f8f02962c1a60760"} Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.443682 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-522nc"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.447566 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.448820 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:42.948797576 +0000 UTC m=+207.486992707 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.448856 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcq9n\" (UniqueName: \"kubernetes.io/projected/4bf6d6d4-7566-4d1a-acc9-161b6b16f93d-kube-api-access-lcq9n\") pod \"migrator-59844c95c7-bpk8d\" (UID: \"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.459205 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.479078 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.481628 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-fpxzh"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.481679 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb"] Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.495166 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.534184 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.543325 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.550747 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.555603 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.556336 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.056306008 +0000 UTC m=+207.594501139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.559606 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.563230 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.598401 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-8m25n" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.652504 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.652841 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.152824404 +0000 UTC m=+207.691019535 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: W0319 00:10:42.706389 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd804c03_3021_44bd_8ce8_a10a482c59b4.slice/crio-c44856eba25dffd0a3b3bc92a50fcc693f803312f2a13c32d00a20a42a8a96d1 WatchSource:0}: Error finding container c44856eba25dffd0a3b3bc92a50fcc693f803312f2a13c32d00a20a42a8a96d1: Status 404 returned error can't find the container with id c44856eba25dffd0a3b3bc92a50fcc693f803312f2a13c32d00a20a42a8a96d1 Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.754022 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.754477 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.254461171 +0000 UTC m=+207.792656302 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.756603 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-h477l" podStartSLOduration=168.756582891 podStartE2EDuration="2m48.756582891s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:42.709127375 +0000 UTC m=+207.247322506" watchObservedRunningTime="2026-03-19 00:10:42.756582891 +0000 UTC m=+207.294778022" Mar 19 00:10:42 crc kubenswrapper[4745]: W0319 00:10:42.778215 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199f2552_58de_4ea8_adf5_f1aee925f49b.slice/crio-a7e8a76de0a990e17164cd5f8f309b47863340c2c5792ac20c6305ad85bbd642 WatchSource:0}: Error finding container a7e8a76de0a990e17164cd5f8f309b47863340c2c5792ac20c6305ad85bbd642: Status 404 returned error can't find the container with id a7e8a76de0a990e17164cd5f8f309b47863340c2c5792ac20c6305ad85bbd642 Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.847808 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"] Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.855031 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.355010621 +0000 UTC m=+207.893205752 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.855066 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.855382 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.856095 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.356068417 +0000 UTC m=+207.894263548 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.865503 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" podStartSLOduration=168.865485249 podStartE2EDuration="2m48.865485249s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:42.86399985 +0000 UTC m=+207.402194991" watchObservedRunningTime="2026-03-19 00:10:42.865485249 +0000 UTC m=+207.403680380" Mar 19 00:10:42 crc kubenswrapper[4745]: I0319 00:10:42.958376 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:42 crc kubenswrapper[4745]: E0319 00:10:42.959158 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.45913181 +0000 UTC m=+207.997326941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.061105 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.061491 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.56147881 +0000 UTC m=+208.099673931 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.163109 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.163516 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.663494899 +0000 UTC m=+208.201690040 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.230051 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" podStartSLOduration=169.230028809 podStartE2EDuration="2m49.230028809s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:43.223005196 +0000 UTC m=+207.761200347" watchObservedRunningTime="2026-03-19 00:10:43.230028809 +0000 UTC m=+207.768223960" Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.264664 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.265041 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.765028212 +0000 UTC m=+208.303223343 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.288529 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" podStartSLOduration=169.288509922 podStartE2EDuration="2m49.288509922s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:43.288236873 +0000 UTC m=+207.826432004" watchObservedRunningTime="2026-03-19 00:10:43.288509922 +0000 UTC m=+207.826705053" Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.366411 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.367160 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.867144694 +0000 UTC m=+208.405339825 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.435325 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-hg72d" podStartSLOduration=169.435303709 podStartE2EDuration="2m49.435303709s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:43.370807356 +0000 UTC m=+207.909002487" watchObservedRunningTime="2026-03-19 00:10:43.435303709 +0000 UTC m=+207.973498840" Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.483114 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.483415 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:43.983404037 +0000 UTC m=+208.521599168 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.502185 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8m25n" event={"ID":"e5beee30-fb62-4d40-91bd-c2f4b1efab1f","Type":"ContainerStarted","Data":"0fc595965cbda51556e2bcb7121dc07342283b1b7d2cc42a9b005cfc92033228"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.504811 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-jrq7v" event={"ID":"5c3bf4d2-ad08-42d0-bd92-f94074fc4833","Type":"ContainerStarted","Data":"b836bf30cafe76b55123bd7ef9a2c3b608c7ecbd64d73eee0d014dac33a102ec"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.518532 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-ssbjs" event={"ID":"053b13b0-078a-45ea-a005-e38aab17b42f","Type":"ContainerStarted","Data":"407356ac1a1db0d3eb06d4b2891c2b80f16c2f7708e3ec79c43354714f8e98f1"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.544054 4745 generic.go:334] "Generic (PLEG): container finished" podID="19580e75-5123-4261-ac5c-96dbd7834613" containerID="13fe52116b9b431e2966593c12d825d4267101d61b43a3ee6a2395733445a5b4" exitCode=0 Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.545024 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" 
event={"ID":"19580e75-5123-4261-ac5c-96dbd7834613","Type":"ContainerDied","Data":"13fe52116b9b431e2966593c12d825d4267101d61b43a3ee6a2395733445a5b4"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.553371 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" event={"ID":"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330","Type":"ContainerStarted","Data":"2a61cdef8a7ab2eb222650aab751fecc0381114e26cb753d2ce9e33592c1ea4e"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.591045 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerStarted","Data":"1ba6e1f38cebd60b48f169bf9b16cb68b35cbd4232e7b7482a4e5339486334e0"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.602965 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" event={"ID":"199f2552-58de-4ea8-adf5-f1aee925f49b","Type":"ContainerStarted","Data":"a7e8a76de0a990e17164cd5f8f309b47863340c2c5792ac20c6305ad85bbd642"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.605563 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ljtrr" event={"ID":"38dd3b53-64de-4201-b427-0b1bc3e51849","Type":"ContainerStarted","Data":"06a819ea983bf97ea0acfffb68d054ca1811dc7ddd4281a8c5ae4c1f7282ba1b"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.612061 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.615682 4745 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.11564924 +0000 UTC m=+208.653844381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.626264 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.628180 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.128166736 +0000 UTC m=+208.666361867 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.647917 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" event={"ID":"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4","Type":"ContainerStarted","Data":"6ed318766e28a7953b973ede1884a1baaf2b2983a95f1b27491abc323fc1f4ae"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.698100 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" event={"ID":"a43287ca-c4a1-424a-86b5-f7f4f1a627a9","Type":"ContainerStarted","Data":"4e1ecca63a7c08df02a52f04baf81555a44dbe0946e6423e828e3df2518bba61"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.712573 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29564640-xrq9h" event={"ID":"88004414-de81-4e3c-9f3f-99f90a3bbc98","Type":"ContainerStarted","Data":"f0c4e6e413094127a3df81fb923d127a9cc66b65e1e4d1cf289b3133f8a3d81a"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.729838 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.746025 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.24598521 +0000 UTC m=+208.784180341 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.754998 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" event={"ID":"fd804c03-3021-44bd-8ce8-a10a482c59b4","Type":"ContainerStarted","Data":"c44856eba25dffd0a3b3bc92a50fcc693f803312f2a13c32d00a20a42a8a96d1"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.774058 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" event={"ID":"116f15d2-ff67-4a98-846a-29bd6a129bbd","Type":"ContainerStarted","Data":"0b1f554776336ed65fdb0dd1df2acef087f9af61751e65e93f4ff9e5891a6202"} Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.790585 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.849740 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.852084 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.352069664 +0000 UTC m=+208.890264795 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.926747 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-445gs" podStartSLOduration=169.926731334 podStartE2EDuration="2m49.926731334s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:43.926334261 +0000 UTC m=+208.464529402" watchObservedRunningTime="2026-03-19 00:10:43.926731334 +0000 UTC m=+208.464926465" Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.951382 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:43 crc kubenswrapper[4745]: E0319 00:10:43.952216 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.45219836 +0000 UTC m=+208.990393481 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:43 crc kubenswrapper[4745]: I0319 00:10:43.954478 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-pruner-29564640-xrq9h" podStartSLOduration=169.954456866 podStartE2EDuration="2m49.954456866s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:43.952023744 +0000 UTC m=+208.490218875" watchObservedRunningTime="2026-03-19 00:10:43.954456866 +0000 UTC m=+208.492651997" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.019966 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:44 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:44 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:44 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.020697 4745 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.054055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.054515 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.554502239 +0000 UTC m=+209.092697370 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.157521 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.158006 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.657988797 +0000 UTC m=+209.196183928 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.266041 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.266632 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.766620035 +0000 UTC m=+209.304815166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.332633 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-t99wg"] Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.353004 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t"] Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.353289 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-jrq7v" podStartSLOduration=170.353259543 podStartE2EDuration="2m50.353259543s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.340396046 +0000 UTC m=+208.878591177" watchObservedRunningTime="2026-03-19 00:10:44.353259543 +0000 UTC m=+208.891454674" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.369416 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.369757 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.86972943 +0000 UTC m=+209.407924561 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.389156 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-ssbjs" podStartSLOduration=170.389128375 podStartE2EDuration="2m50.389128375s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.385379461 +0000 UTC m=+208.923574592" watchObservedRunningTime="2026-03-19 00:10:44.389128375 +0000 UTC m=+208.927323506" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.401584 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr"] Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.472657 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:44 crc 
kubenswrapper[4745]: E0319 00:10:44.472992 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:44.97297964 +0000 UTC m=+209.511174761 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.574190 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.574872 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.074856075 +0000 UTC m=+209.613051206 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.676702 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.677021 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.177009858 +0000 UTC m=+209.715204989 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.782349 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.783249 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.283233167 +0000 UTC m=+209.821428288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.804080 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:44 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:44 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:44 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.804128 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.846996 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" event={"ID":"46576b1f-4646-44ba-a896-d509b05801cd","Type":"ContainerStarted","Data":"78985eeb1b6cbf5102aef43d77d961e7298828c776df768c5bf0697baf2864d4"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.857627 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" event={"ID":"19580e75-5123-4261-ac5c-96dbd7834613","Type":"ContainerStarted","Data":"87339a16f298d1a4a72e4572fa9f1ebb981ab0ccb8ac47e278cc8ed00c9086ef"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 
00:10:44.858921 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.860142 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" event={"ID":"5e57c8e3-d9a0-42bc-98d3-336656039e9c","Type":"ContainerStarted","Data":"904d5df0701de0ed53d5eb62af3fa4561c590e320e15e70fe544c6dd7e0f874d"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.861126 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" event={"ID":"fd804c03-3021-44bd-8ce8-a10a482c59b4","Type":"ContainerStarted","Data":"b64204c9abae73978603b4fbd4fb335833fc1e0d343bc43acc9a2faa6daab5ad"} Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.863765 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.866731 4745 patch_prober.go:28] interesting pod/console-operator-58897d9998-fpxzh container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.866869 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" podUID="fd804c03-3021-44bd-8ce8-a10a482c59b4" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.23:8443/readyz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.885322 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.889003 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" podStartSLOduration=170.888985431 podStartE2EDuration="2m50.888985431s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.881649196 +0000 UTC m=+209.419844327" watchObservedRunningTime="2026-03-19 00:10:44.888985431 +0000 UTC m=+209.427180562"
Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.889537 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.389525239 +0000 UTC m=+209.927720370 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.890403 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" event={"ID":"ab89302b-10a8-43fa-ad93-699274acaac3","Type":"ContainerStarted","Data":"4c72380c747f9ba0747e7bd063d99600527ad18ace1c01c5066ad909b432c333"}
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.892919 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" event={"ID":"199f2552-58de-4ea8-adf5-f1aee925f49b","Type":"ContainerStarted","Data":"b73ccaa8f9fc933966f7bc38763ad254954a0421577b83726cbe7c1e025ed15a"}
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.896499 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-ljtrr" event={"ID":"38dd3b53-64de-4201-b427-0b1bc3e51849","Type":"ContainerStarted","Data":"4892f51e711725e1250e3b1141628fa13136d81a6a085ea386d07481683c75e7"}
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.897944 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-ljtrr"
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.900902 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-ljtrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body=
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.900935 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ljtrr" podUID="38dd3b53-64de-4201-b427-0b1bc3e51849" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.921822 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" podStartSLOduration=170.921806701 podStartE2EDuration="2m50.921806701s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.921515301 +0000 UTC m=+209.459710432" watchObservedRunningTime="2026-03-19 00:10:44.921806701 +0000 UTC m=+209.460001832"
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.922330 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" event={"ID":"116f15d2-ff67-4a98-846a-29bd6a129bbd","Type":"ContainerStarted","Data":"cc68f7725174dd71269b9d76da0bec440434f16b38af3efc1226f021a7ab8035"}
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.950238 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" event={"ID":"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c","Type":"ContainerStarted","Data":"6ef5b654b82c6cb0ee18ace7ea52f6949cfabf5e28bdeb589877987821497a27"}
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.966963 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" podStartSLOduration=170.96694508 podStartE2EDuration="2m50.96694508s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.965590685 +0000 UTC m=+209.503785826" watchObservedRunningTime="2026-03-19 00:10:44.96694508 +0000 UTC m=+209.505140211"
Mar 19 00:10:44 crc kubenswrapper[4745]: I0319 00:10:44.985855 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 00:10:44 crc kubenswrapper[4745]: E0319 00:10:44.987368 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.487352769 +0000 UTC m=+210.025547900 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.004422 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" event={"ID":"a43287ca-c4a1-424a-86b5-f7f4f1a627a9","Type":"ContainerStarted","Data":"0a5dc6241dc9c5f25fea3eba32cc7248c7e5aab6abf8a2997db181e1200f9d0b"}
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.014305 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" event={"ID":"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4","Type":"ContainerStarted","Data":"5e97a389fffe604d654de61eb43dad886ba3f0e357ed64056f25bb2e0ae3b360"}
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.014709 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-522nc"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.015979 4745 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-522nc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.34:6443/healthz\": dial tcp 10.217.0.34:6443: connect: connection refused" start-of-body=
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.017221 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.34:6443/healthz\": dial tcp 10.217.0.34:6443: connect: connection refused"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.038763 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" event={"ID":"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330","Type":"ContainerStarted","Data":"cb7d6e63f2def1fa1c9273f764b0371b6595ee5b9e0575c36e1c68a6e225e24a"}
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.038830 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" event={"ID":"1cee17e3-a84a-42b3-8cbf-9e4dd2c76330","Type":"ContainerStarted","Data":"46d4bd55d82782d41dccb95fb2c4f3eec6a6684e7d20a824ed68bfabd6ca757a"}
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.042737 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-74pfb" podStartSLOduration=171.042717087 podStartE2EDuration="2m51.042717087s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:44.992204099 +0000 UTC m=+209.530399230" watchObservedRunningTime="2026-03-19 00:10:45.042717087 +0000 UTC m=+209.580912218"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.070580 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerStarted","Data":"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8"}
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.074106 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.082996 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-8xctf" podStartSLOduration=171.082960724 podStartE2EDuration="2m51.082960724s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.042654205 +0000 UTC m=+209.580849336" watchObservedRunningTime="2026-03-19 00:10:45.082960724 +0000 UTC m=+209.621155855"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.084197 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-c8nmg" podStartSLOduration=171.084190985 podStartE2EDuration="2m51.084190985s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.072535288 +0000 UTC m=+209.610730419" watchObservedRunningTime="2026-03-19 00:10:45.084190985 +0000 UTC m=+209.622386116"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.091031 4745 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qn8c4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body=
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.091091 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.092507 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.096641 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.596624569 +0000 UTC m=+210.134819700 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.111737 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-8m25n" event={"ID":"e5beee30-fb62-4d40-91bd-c2f4b1efab1f","Type":"ContainerStarted","Data":"35405b1a963706967dee2af7c746b2e7281fcebdd54b9c8264435060fbf73e5d"}
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.114022 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-ljtrr" podStartSLOduration=171.114005375 podStartE2EDuration="2m51.114005375s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.112394812 +0000 UTC m=+209.650589953" watchObservedRunningTime="2026-03-19 00:10:45.114005375 +0000 UTC m=+209.652200506"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.137759 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" podStartSLOduration=171.137732634 podStartE2EDuration="2m51.137732634s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.135597513 +0000 UTC m=+209.673792644" watchObservedRunningTime="2026-03-19 00:10:45.137732634 +0000 UTC m=+209.675927765"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.139806 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" event={"ID":"4c46c928-9116-4548-b20e-d9a66d439012","Type":"ContainerStarted","Data":"d21fb403e1f1a1258f05dcaeab94eb72788193cd8b1cb8f7a5539cbd0770a04d"}
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.178460 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" podStartSLOduration=171.178433535 podStartE2EDuration="2m51.178433535s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.17795724 +0000 UTC m=+209.716152381" watchObservedRunningTime="2026-03-19 00:10:45.178433535 +0000 UTC m=+209.716628666"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.184107 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh"]
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.195688 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.695659528 +0000 UTC m=+210.233854659 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.195826 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.197810 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.201015 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.701003895 +0000 UTC m=+210.239199026 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.210221 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564650-7k6ld"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.234498 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-8m25n" podStartSLOduration=6.234475197 podStartE2EDuration="6.234475197s" podCreationTimestamp="2026-03-19 00:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.230797186 +0000 UTC m=+209.768992317" watchObservedRunningTime="2026-03-19 00:10:45.234475197 +0000 UTC m=+209.772670318"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.242653 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.254589 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.299654 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.302763 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.802712424 +0000 UTC m=+210.340907555 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.329166 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.329971 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.357998 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.369785 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-kjmjw" podStartSLOduration=171.369758122 podStartE2EDuration="2m51.369758122s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.285362988 +0000 UTC m=+209.823558129" watchObservedRunningTime="2026-03-19 00:10:45.369758122 +0000 UTC m=+209.907953283"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.389426 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m7nb5" podStartSLOduration=171.389396724 podStartE2EDuration="2m51.389396724s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:45.329754413 +0000 UTC m=+209.867949544" watchObservedRunningTime="2026-03-19 00:10:45.389396724 +0000 UTC m=+209.927591855"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.399831 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.402865 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.403250 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:45.903237204 +0000 UTC m=+210.441432335 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.432205 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.432276 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.434117 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-54kzj"]
Mar 19 00:10:45 crc kubenswrapper[4745]: W0319 00:10:45.440935 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c970483_eb0d_49da_b0bc_d1685f8bf7f1.slice/crio-95ec3f27792a6b83b095a82fbd8b7f3da2cb814dddd8f101ff3a318d87f29868 WatchSource:0}: Error finding container 95ec3f27792a6b83b095a82fbd8b7f3da2cb814dddd8f101ff3a318d87f29868: Status 404 returned error can't find the container with id 95ec3f27792a6b83b095a82fbd8b7f3da2cb814dddd8f101ff3a318d87f29868
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.443862 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-j2mf5"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.451981 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.457604 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.503432 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.508123 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.008091828 +0000 UTC m=+210.546286969 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: W0319 00:10:45.523388 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6b29d2c_9784_4f10_b0d1_09e88ddf5df1.slice/crio-f5c1e2051cd6dd3e4aa42bb48159f89940ea725af562fa476943499e387b22d9 WatchSource:0}: Error finding container f5c1e2051cd6dd3e4aa42bb48159f89940ea725af562fa476943499e387b22d9: Status 404 returned error can't find the container with id f5c1e2051cd6dd3e4aa42bb48159f89940ea725af562fa476943499e387b22d9
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.610725 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.611292 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.611316 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.111302176 +0000 UTC m=+210.649497307 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.611032 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.641576 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-m56ls"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.645218 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-t28kd"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.703822 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-w2988"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.704566 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.715486 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.715841 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.215827888 +0000 UTC m=+210.754023019 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.716691 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.727948 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-v2dhg"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.729564 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.743366 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.778439 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.790996 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.806228 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 00:10:45 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld
Mar 19 00:10:45 crc kubenswrapper[4745]: [+]process-running ok
Mar 19 00:10:45 crc kubenswrapper[4745]: healthz check failed
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.806287 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.806355 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.807934 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq"]
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.817381 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.818587 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.318564711 +0000 UTC m=+210.856759842 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: W0319 00:10:45.882472 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75bf4c3d_1ce3_48df_8598_7f72667807c1.slice/crio-6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121 WatchSource:0}: Error finding container 6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121: Status 404 returned error can't find the container with id 6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.920690 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.921225 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.42118019 +0000 UTC m=+210.959375321 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: I0319 00:10:45.921403 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:45 crc kubenswrapper[4745]: E0319 00:10:45.921952 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.421943526 +0000 UTC m=+210.960138657 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 19 00:10:45 crc kubenswrapper[4745]: W0319 00:10:45.946555 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bbc2a79_7f59_45c7_93ac_47ec1e2e7d1d.slice/crio-0b805cdd0ad1515a35bec0557002ce429d01a88da1ef42a23c804d561cd1acf9 WatchSource:0}: Error finding container 0b805cdd0ad1515a35bec0557002ce429d01a88da1ef42a23c804d561cd1acf9: Status 404 returned error can't find the container with id 0b805cdd0ad1515a35bec0557002ce429d01a88da1ef42a23c804d561cd1acf9
Mar 19 00:10:45 crc kubenswrapper[4745]: W0319 00:10:45.993005 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b00ac21_bd51_4aff_a8fc_d14d2b930940.slice/crio-2bb865cd814bd8f7f1c2aaf9928c012bd72a69d9dd71dcdfdb83958c788f8fc8 WatchSource:0}: Error finding container 2bb865cd814bd8f7f1c2aaf9928c012bd72a69d9dd71dcdfdb83958c788f8fc8: Status 404 returned error can't find the container with id 2bb865cd814bd8f7f1c2aaf9928c012bd72a69d9dd71dcdfdb83958c788f8fc8
Mar 19 00:10:46 crc kubenswrapper[4745]: W0319 00:10:45.997051 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b8f7226_6033_4b1b_bd4d_7c045b9d60ef.slice/crio-8176046f7a0873476e4d347de0183fe72cabaf04ffb1a02980b1382c32e9c813 WatchSource:0}: Error finding container 8176046f7a0873476e4d347de0183fe72cabaf04ffb1a02980b1382c32e9c813: Status 404 returned error can't find the container
with id 8176046f7a0873476e4d347de0183fe72cabaf04ffb1a02980b1382c32e9c813 Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.005844 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36666: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.022743 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.023050 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.523035304 +0000 UTC m=+211.061230435 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.106609 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36678: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.124526 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.124805 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.624795014 +0000 UTC m=+211.162990145 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.207280 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" event={"ID":"ed8d9c15-ca48-46f9-9368-36693ccdbe8b","Type":"ContainerStarted","Data":"a3c320123655c97acc20454e28e5427399334b877aa5a9e39edc365481dccd4d"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.209129 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36690: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.227377 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.228217 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.728200229 +0000 UTC m=+211.266395350 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.241948 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" event={"ID":"cb0a157b-0f6d-4738-ae67-e29407c2ba8e","Type":"ContainerStarted","Data":"de77cbaf251bea6a340e2d1d37523c239f308ef017f3be27325f8a89133255c2"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.242007 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" event={"ID":"cb0a157b-0f6d-4738-ae67-e29407c2ba8e","Type":"ContainerStarted","Data":"97b3cb068ae934f465f1e3736304152c1974f3e63e6f701fa363428da62c55a9"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.253637 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" event={"ID":"b7b46d81-2c53-4021-be8a-f962c576a94c","Type":"ContainerStarted","Data":"9f05658108a21c1cefe2e8fcfb7c619de85dbc0a51cb6af555e69d419aec2db2"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.280259 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m56ls" event={"ID":"10e7df4c-01da-48e6-8d5d-20e788cd4cdf","Type":"ContainerStarted","Data":"9a31e96382b131a9852f2dba890059c373f00b9f7061c9235262a2b22a876a3e"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.297821 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36692: no serving certificate available for the kubelet" Mar 19 00:10:46 crc 
kubenswrapper[4745]: I0319 00:10:46.316043 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" event={"ID":"4394045d-753a-4e2b-8ea5-7087d41481d2","Type":"ContainerStarted","Data":"850635de3db4c88eabd653eb9047ccba37b18499b525639c36aa0c9c9906a67c"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.316116 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" event={"ID":"4394045d-753a-4e2b-8ea5-7087d41481d2","Type":"ContainerStarted","Data":"26cc6c9e72d0535b1f1605a83dd9d618544f1e163201dfa1d9a2581bf0d8dc66"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.317504 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.320318 4745 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-pknnh container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" start-of-body= Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.320369 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" podUID="4394045d-753a-4e2b-8ea5-7087d41481d2" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.40:5443/healthz\": dial tcp 10.217.0.40:5443: connect: connection refused" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.331254 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: 
\"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.333633 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.833615721 +0000 UTC m=+211.371810852 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.336309 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" event={"ID":"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1","Type":"ContainerStarted","Data":"f5c1e2051cd6dd3e4aa42bb48159f89940ea725af562fa476943499e387b22d9"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.341994 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" event={"ID":"d14d18f5-0177-4458-8ea3-b266cc96d658","Type":"ContainerStarted","Data":"158cd16e86d97bd741b6d9d3a091e473262326dad949142d4f04bb6f64676d4b"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.369895 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" event={"ID":"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474","Type":"ContainerStarted","Data":"9a5c5a294d1c9b09d1221febdb25357b0f9b2f6929ab871225c58a83f52832a8"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.380560 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" event={"ID":"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef","Type":"ContainerStarted","Data":"8176046f7a0873476e4d347de0183fe72cabaf04ffb1a02980b1382c32e9c813"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.385799 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" event={"ID":"11217236-1702-4cd3-b097-e3e410cbbdb4","Type":"ContainerStarted","Data":"f790a779b85926d09bc924a556ff58864248e667e250e8a3faf07d58a1b83da8"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.392031 4745 patch_prober.go:28] interesting pod/apiserver-76f77b778f-zxjjt container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]log ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]etcd ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/max-in-flight-filter ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 19 00:10:46 crc kubenswrapper[4745]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 19 00:10:46 crc kubenswrapper[4745]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/project.openshift.io-projectcache ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 19 00:10:46 crc kubenswrapper[4745]: 
[+]poststarthook/openshift.io-startinformers ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 19 00:10:46 crc kubenswrapper[4745]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 00:10:46 crc kubenswrapper[4745]: livez check failed Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.392080 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" podUID="46576b1f-4646-44ba-a896-d509b05801cd" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.398181 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" event={"ID":"07e3a371-481b-4e06-a2e9-e12f3ec3d28b","Type":"ContainerStarted","Data":"c365264ac99ee5d7960d6a1d5f9d985536235be84f6ac3405fad83fa0e5b97e3"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.415330 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36698: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.425268 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" event={"ID":"ffa72fe8-3870-425b-81ff-cb17f6d3ac1c","Type":"ContainerStarted","Data":"fd66c653a2642d275c3084126cee83ac707cbea8285ab428b1180f5004ed82ea"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.432285 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.433698 4745 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:46.933677225 +0000 UTC m=+211.471872366 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.459521 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" event={"ID":"4c46c928-9116-4548-b20e-d9a66d439012","Type":"ContainerStarted","Data":"f827e017aa7525133753ed566b271266d5dbc9f1fe3ddb56d2b7dae84942e48a"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.459564 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" event={"ID":"4c46c928-9116-4548-b20e-d9a66d439012","Type":"ContainerStarted","Data":"f9cd70705364425302817d991fbbb30dbd55eeda27d0d65fcd678e48dfb687db"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.467926 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" event={"ID":"5e57c8e3-d9a0-42bc-98d3-336656039e9c","Type":"ContainerStarted","Data":"53f88b1d56b28535b4c484015019af2e37753d209c01d48cfc93eb31dc1aceb7"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.476335 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" 
event={"ID":"5c970483-eb0d-49da-b0bc-d1685f8bf7f1","Type":"ContainerStarted","Data":"7f3f236625a286867dfa2fcaf4fee062fb38f1a2b848357310a406d1221d43a2"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.476375 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" event={"ID":"5c970483-eb0d-49da-b0bc-d1685f8bf7f1","Type":"ContainerStarted","Data":"95ec3f27792a6b83b095a82fbd8b7f3da2cb814dddd8f101ff3a318d87f29868"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.480117 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" event={"ID":"0a51c238-7c2c-470f-8123-472327367ec8","Type":"ContainerStarted","Data":"5d0de5e75b92038e268ee9997e6aae27e7a47e5b907366dae742f3b2a496b7ab"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.480158 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" event={"ID":"0a51c238-7c2c-470f-8123-472327367ec8","Type":"ContainerStarted","Data":"a47f19439c050d08f57fd3da8aaa73609242b18663e951037e2ae69cdfa6cc99"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.488470 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" event={"ID":"9b00ac21-bd51-4aff-a8fc-d14d2b930940","Type":"ContainerStarted","Data":"2bb865cd814bd8f7f1c2aaf9928c012bd72a69d9dd71dcdfdb83958c788f8fc8"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.491785 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" event={"ID":"4e2ca532-9e45-44d5-b541-3e9b34352d75","Type":"ContainerStarted","Data":"730a87a80725df33bc6c2202e86ed26d7d609c34a96affbf80d914bfe39df27e"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.492241 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" event={"ID":"4e2ca532-9e45-44d5-b541-3e9b34352d75","Type":"ContainerStarted","Data":"33fe87a1faaaf4734e6c67e461ca817a4f35cd5097926b95a5f6393877824430"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.492840 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.496503 4745 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-wz97d container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" start-of-body= Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.496564 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" podUID="4e2ca532-9e45-44d5-b541-3e9b34352d75" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.30:8443/healthz\": dial tcp 10.217.0.30:8443: connect: connection refused" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.508359 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" event={"ID":"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d","Type":"ContainerStarted","Data":"70a4658f51301554466492b39937e1206d8f92491efa2ebb5066085cf5b04349"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.534055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.536626 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.036611885 +0000 UTC m=+211.574807016 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.548172 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36706: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.580199 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" event={"ID":"75bf4c3d-1ce3-48df-8598-7f72667807c1","Type":"ContainerStarted","Data":"6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.608785 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" event={"ID":"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d","Type":"ContainerStarted","Data":"0b805cdd0ad1515a35bec0557002ce429d01a88da1ef42a23c804d561cd1acf9"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.627172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t28kd" 
event={"ID":"ead6910f-478b-45b5-b83c-06d3733635cb","Type":"ContainerStarted","Data":"fd8c5063bf12a3d1e820f817dcd068e4ac562fab3d742fbe681eeab9bb5a2312"} Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.630544 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-ljtrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.630586 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ljtrr" podUID="38dd3b53-64de-4201-b427-0b1bc3e51849" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.631275 4745 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-qn8c4 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" start-of-body= Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.631300 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.31:8080/healthz\": dial tcp 10.217.0.31:8080: connect: connection refused" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.636302 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " 
Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.636607 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.136590556 +0000 UTC m=+211.674785687 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.654495 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-lpdzv" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.671422 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36722: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.696230 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-fpxzh" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.738918 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.742118 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.242106151 +0000 UTC m=+211.780301282 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.790127 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36734: no serving certificate available for the kubelet" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.792959 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:46 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:46 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:46 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.799287 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.842662 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.843094 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.343064435 +0000 UTC m=+211.881259566 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:46 crc kubenswrapper[4745]: I0319 00:10:46.943607 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:46 crc kubenswrapper[4745]: E0319 00:10:46.944389 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.444378051 +0000 UTC m=+211.982573182 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.046086 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.047291 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.547261369 +0000 UTC m=+212.085456500 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.047627 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.047949 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.547942062 +0000 UTC m=+212.086137193 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.078503 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-m56ls" podStartSLOduration=8.078485186 podStartE2EDuration="8.078485186s" podCreationTimestamp="2026-03-19 00:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.078209437 +0000 UTC m=+211.616404578" watchObservedRunningTime="2026-03-19 00:10:47.078485186 +0000 UTC m=+211.616680317" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.111554 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-5g88t" podStartSLOduration=173.111533894 podStartE2EDuration="2m53.111533894s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.036663297 +0000 UTC m=+211.574858428" watchObservedRunningTime="2026-03-19 00:10:47.111533894 +0000 UTC m=+211.649729025" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.148198 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.148455 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.64844055 +0000 UTC m=+212.186635671 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.186139 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-g2gxp" podStartSLOduration=173.186120481 podStartE2EDuration="2m53.186120481s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.132334404 +0000 UTC m=+211.670529535" watchObservedRunningTime="2026-03-19 00:10:47.186120481 +0000 UTC m=+211.724315612" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.229337 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-fztjk" podStartSLOduration=173.229321097 podStartE2EDuration="2m53.229321097s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.227875098 +0000 UTC 
m=+211.766070259" watchObservedRunningTime="2026-03-19 00:10:47.229321097 +0000 UTC m=+211.767516238" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.229609 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" podStartSLOduration=173.229601526 podStartE2EDuration="2m53.229601526s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.188392547 +0000 UTC m=+211.726587678" watchObservedRunningTime="2026-03-19 00:10:47.229601526 +0000 UTC m=+211.767796667" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.255823 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.256267 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.756250691 +0000 UTC m=+212.294445822 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.352805 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-t99wg" podStartSLOduration=173.352786789 podStartE2EDuration="2m53.352786789s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.304946299 +0000 UTC m=+211.843141450" watchObservedRunningTime="2026-03-19 00:10:47.352786789 +0000 UTC m=+211.890981920" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.357472 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.357797 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.857782494 +0000 UTC m=+212.395977615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.398266 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" podStartSLOduration=173.398250278 podStartE2EDuration="2m53.398250278s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.354476604 +0000 UTC m=+211.892671735" watchObservedRunningTime="2026-03-19 00:10:47.398250278 +0000 UTC m=+211.936445409" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.429571 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-5gsjw" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.438482 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" podStartSLOduration=173.438464204 podStartE2EDuration="2m53.438464204s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.398403793 +0000 UTC m=+211.936598934" watchObservedRunningTime="2026-03-19 00:10:47.438464204 +0000 UTC m=+211.976659335" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.459338 4745 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.459782 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:47.959771222 +0000 UTC m=+212.497966353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.490271 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-z9bmr" podStartSLOduration=173.490251395 podStartE2EDuration="2m53.490251395s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.489410107 +0000 UTC m=+212.027605238" watchObservedRunningTime="2026-03-19 00:10:47.490251395 +0000 UTC m=+212.028446526" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.565248 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.565750 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.065735243 +0000 UTC m=+212.603930364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.591344 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36744: no serving certificate available for the kubelet" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.628953 4745 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-522nc container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.34:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.629018 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.34:6443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.648032 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" event={"ID":"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef","Type":"ContainerStarted","Data":"233cd4d2d7675e58c02b7c7fcd3b8f3f72ecebe92cdb5bd339ef3320b6064b2a"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.648088 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" event={"ID":"2b8f7226-6033-4b1b-bd4d-7c045b9d60ef","Type":"ContainerStarted","Data":"7792c7b67854f5d80159c20538facbcb3e17da47a14c28a612bf193658f9e505"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.648831 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.658699 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" event={"ID":"07e3a371-481b-4e06-a2e9-e12f3ec3d28b","Type":"ContainerStarted","Data":"120c624cd3e05399f1516d9a371ad97a27d4c570451ddeca5e53a8001229af9f"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.658756 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" event={"ID":"07e3a371-481b-4e06-a2e9-e12f3ec3d28b","Type":"ContainerStarted","Data":"d1a48b356cbda6d6d63f06aa8c59df3b1dec71943a2dafd2adfa31ddf8fe0ae0"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.666333 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" 
event={"ID":"ed8d9c15-ca48-46f9-9368-36693ccdbe8b","Type":"ContainerStarted","Data":"9ffe20c3a8c968c06e01ed2950b65ea7adc4b0b6959b0f17fd566d35bd22f752"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.667154 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.668102 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.168089243 +0000 UTC m=+212.706284374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.684577 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" podStartSLOduration=173.68455915 podStartE2EDuration="2m53.68455915s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.681899581 +0000 UTC m=+212.220094712" watchObservedRunningTime="2026-03-19 
00:10:47.68455915 +0000 UTC m=+212.222754281" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.699175 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"] Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.699482 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerName="controller-manager" containerID="cri-o://47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863" gracePeriod=30 Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.728464 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t28kd" event={"ID":"ead6910f-478b-45b5-b83c-06d3733635cb","Type":"ContainerStarted","Data":"4fb945a1dd06b0eedfc8c31073b82dc3c078b39b4c594868fd0390f0d5038d03"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.732234 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-p29f9" podStartSLOduration=173.732217533 podStartE2EDuration="2m53.732217533s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.729072438 +0000 UTC m=+212.267267579" watchObservedRunningTime="2026-03-19 00:10:47.732217533 +0000 UTC m=+212.270412664" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.744120 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-j9cxt" event={"ID":"0a51c238-7c2c-470f-8123-472327367ec8","Type":"ContainerStarted","Data":"9184b12a3f20fa02533e2f845c578c719786e03937f5ec24372bae1d7dbc1c80"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.765354 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"] Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.765544 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" podUID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" containerName="route-controller-manager" containerID="cri-o://adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1" gracePeriod=30 Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.769264 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.769402 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.269384698 +0000 UTC m=+212.807579829 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.769571 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.771400 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.271386004 +0000 UTC m=+212.809581135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.772075 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" event={"ID":"b7b46d81-2c53-4021-be8a-f962c576a94c","Type":"ContainerStarted","Data":"a51cd4499cb77c80bc2273995247be55c1ea12c65b716cf8ee53249eb4b31c8d"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.774905 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" event={"ID":"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d","Type":"ContainerStarted","Data":"25ee3192903054722ca0b27799a7cdca6ae03628c87139ac7beb46814f2ce3a1"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.774938 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" event={"ID":"4bf6d6d4-7566-4d1a-acc9-161b6b16f93d","Type":"ContainerStarted","Data":"47cac70244a977c64b07076488a41e24da4931abf01f36a9cb57a257c6591a6d"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.778012 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-m56ls" event={"ID":"10e7df4c-01da-48e6-8d5d-20e788cd4cdf","Type":"ContainerStarted","Data":"51b31deae6284da1e0c6544e2eb4fdf3f9ebb6934497896f028fbf5e82ad7df1"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.809849 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" 
event={"ID":"7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d","Type":"ContainerStarted","Data":"1689e54ac6b38bf6f8e51354e2859d39b0a2257632494bd652cf7dcd045d2767"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.810632 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:47 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:47 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:47 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.810674 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.812661 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.819131 4745 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c2x4m container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" start-of-body= Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.819187 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" podUID="7bbc2a79-7f59-45c7-93ac-47ec1e2e7d1d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.27:8443/healthz\": dial tcp 10.217.0.27:8443: connect: connection refused" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 
00:10:47.843743 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" event={"ID":"7c88ec95-11d6-4ef9-8ba8-2bf0495e4474","Type":"ContainerStarted","Data":"28132d65e459ce4465574a3af0e9a6857fa66851b653a8259754ea3ada5b1a06"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.864023 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" event={"ID":"11217236-1702-4cd3-b097-e3e410cbbdb4","Type":"ContainerStarted","Data":"37dae2266302bca7574c610df93b0c5d759818fce7d3068c5e513e570657ed24"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.871154 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:47 crc kubenswrapper[4745]: E0319 00:10:47.872861 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.372843534 +0000 UTC m=+212.911038665 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.875358 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-bpk8d" podStartSLOduration=173.875339667 podStartE2EDuration="2m53.875339667s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.874085375 +0000 UTC m=+212.412280506" watchObservedRunningTime="2026-03-19 00:10:47.875339667 +0000 UTC m=+212.413534798" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.876244 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-kqwz5" podStartSLOduration=173.876235767 podStartE2EDuration="2m53.876235767s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.801492714 +0000 UTC m=+212.339687845" watchObservedRunningTime="2026-03-19 00:10:47.876235767 +0000 UTC m=+212.414430898" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.883317 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" 
event={"ID":"75bf4c3d-1ce3-48df-8598-7f72667807c1","Type":"ContainerStarted","Data":"9e7db3c4b8160a045a6441db451fbc03b58d9027bbe08bfa7d59fe62a3ed7321"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.884831 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" event={"ID":"9b00ac21-bd51-4aff-a8fc-d14d2b930940","Type":"ContainerStarted","Data":"72e47f42a61f6f960a5acc35595487c91d5d0582bf390852b826e9713883ccb3"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.914821 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" event={"ID":"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1","Type":"ContainerStarted","Data":"38472aefa56c93474d44ed4ee109273167b40317645d09b3fdee2d2de98038fd"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.914912 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" event={"ID":"c6b29d2c-9784-4f10-b0d1-09e88ddf5df1","Type":"ContainerStarted","Data":"6d43969fa88e73d0ecf4ee37011ecf68c352cb1f985f7c03dfdcc19d11eae93a"} Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.924422 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-ljtrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.924509 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ljtrr" podUID="38dd3b53-64de-4201-b427-0b1bc3e51849" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.925559 4745 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" podStartSLOduration=173.925545305 podStartE2EDuration="2m53.925545305s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.924728488 +0000 UTC m=+212.462923629" watchObservedRunningTime="2026-03-19 00:10:47.925545305 +0000 UTC m=+212.463740426" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.927710 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.945800 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-wz97d" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.962631 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-8xwnq" podStartSLOduration=173.962610777 podStartE2EDuration="2m53.962610777s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.960953181 +0000 UTC m=+212.499148332" watchObservedRunningTime="2026-03-19 00:10:47.962610777 +0000 UTC m=+212.500805908" Mar 19 00:10:47 crc kubenswrapper[4745]: I0319 00:10:47.973973 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:47 crc 
kubenswrapper[4745]: E0319 00:10:47.976356 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.476341923 +0000 UTC m=+213.014537054 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.002557 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-j2mf5" podStartSLOduration=174.002535273 podStartE2EDuration="2m54.002535273s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:47.996193322 +0000 UTC m=+212.534388483" watchObservedRunningTime="2026-03-19 00:10:48.002535273 +0000 UTC m=+212.540730404" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.076371 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.077038 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.577020617 +0000 UTC m=+213.115215748 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.125413 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" podStartSLOduration=174.125393665 podStartE2EDuration="2m54.125393665s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:48.075696483 +0000 UTC m=+212.613891614" watchObservedRunningTime="2026-03-19 00:10:48.125393665 +0000 UTC m=+212.663588806" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.178484 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.178831 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 00:10:48.678818139 +0000 UTC m=+213.217013270 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.279778 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.280528 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.780506737 +0000 UTC m=+213.318701868 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.280599 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.281293 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.781281673 +0000 UTC m=+213.319476804 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.382535 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.382673 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.882649891 +0000 UTC m=+213.420845022 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.383041 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.383356 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:48.883343594 +0000 UTC m=+213.421538725 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.395880 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-pknnh" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.455030 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8j59v" podStartSLOduration=174.455015494 podStartE2EDuration="2m54.455015494s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:48.151005096 +0000 UTC m=+212.689200227" watchObservedRunningTime="2026-03-19 00:10:48.455015494 +0000 UTC m=+212.993210615" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.490426 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.490753 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 00:10:48.990739011 +0000 UTC m=+213.528934142 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.498397 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.595055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.595399 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.095387658 +0000 UTC m=+213.633582789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.673135 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.695958 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-866gr\" (UniqueName: \"kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr\") pod \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.696106 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca\") pod \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.696128 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config\") pod \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.696174 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert\") pod 
\"555c1cf8-c2b3-4e47-9fa9-314a8672b437\" (UID: \"555c1cf8-c2b3-4e47-9fa9-314a8672b437\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.696266 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.696567 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.196548428 +0000 UTC m=+213.734743559 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.697007 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca" (OuterVolumeSpecName: "client-ca") pod "555c1cf8-c2b3-4e47-9fa9-314a8672b437" (UID: "555c1cf8-c2b3-4e47-9fa9-314a8672b437"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.697171 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config" (OuterVolumeSpecName: "config") pod "555c1cf8-c2b3-4e47-9fa9-314a8672b437" (UID: "555c1cf8-c2b3-4e47-9fa9-314a8672b437"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.705389 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr" (OuterVolumeSpecName: "kube-api-access-866gr") pod "555c1cf8-c2b3-4e47-9fa9-314a8672b437" (UID: "555c1cf8-c2b3-4e47-9fa9-314a8672b437"). InnerVolumeSpecName "kube-api-access-866gr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.709509 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "555c1cf8-c2b3-4e47-9fa9-314a8672b437" (UID: "555c1cf8-c2b3-4e47-9fa9-314a8672b437"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.790513 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:48 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:48 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:48 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.790758 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.798439 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles\") pod \"16dafcd2-537e-46fe-8028-41bc6ff146a0\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.798504 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zznl9\" (UniqueName: \"kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9\") pod \"16dafcd2-537e-46fe-8028-41bc6ff146a0\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.798667 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca\") pod \"16dafcd2-537e-46fe-8028-41bc6ff146a0\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 
00:10:48.798727 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config\") pod \"16dafcd2-537e-46fe-8028-41bc6ff146a0\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.798768 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert\") pod \"16dafcd2-537e-46fe-8028-41bc6ff146a0\" (UID: \"16dafcd2-537e-46fe-8028-41bc6ff146a0\") " Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799009 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799061 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/555c1cf8-c2b3-4e47-9fa9-314a8672b437-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799075 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-866gr\" (UniqueName: \"kubernetes.io/projected/555c1cf8-c2b3-4e47-9fa9-314a8672b437-kube-api-access-866gr\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799085 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799093 4745 reconciler_common.go:293] "Volume detached for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/555c1cf8-c2b3-4e47-9fa9-314a8672b437-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799269 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "16dafcd2-537e-46fe-8028-41bc6ff146a0" (UID: "16dafcd2-537e-46fe-8028-41bc6ff146a0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.799328 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.299316782 +0000 UTC m=+213.837511913 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799521 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca" (OuterVolumeSpecName: "client-ca") pod "16dafcd2-537e-46fe-8028-41bc6ff146a0" (UID: "16dafcd2-537e-46fe-8028-41bc6ff146a0"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.799830 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config" (OuterVolumeSpecName: "config") pod "16dafcd2-537e-46fe-8028-41bc6ff146a0" (UID: "16dafcd2-537e-46fe-8028-41bc6ff146a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.804545 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9" (OuterVolumeSpecName: "kube-api-access-zznl9") pod "16dafcd2-537e-46fe-8028-41bc6ff146a0" (UID: "16dafcd2-537e-46fe-8028-41bc6ff146a0"). InnerVolumeSpecName "kube-api-access-zznl9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.809333 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "16dafcd2-537e-46fe-8028-41bc6ff146a0" (UID: "16dafcd2-537e-46fe-8028-41bc6ff146a0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.900755 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.901181 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.401148615 +0000 UTC m=+213.939343756 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901284 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901546 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/16dafcd2-537e-46fe-8028-41bc6ff146a0-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901565 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901576 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zznl9\" (UniqueName: \"kubernetes.io/projected/16dafcd2-537e-46fe-8028-41bc6ff146a0-kube-api-access-zznl9\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901587 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.901598 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/16dafcd2-537e-46fe-8028-41bc6ff146a0-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:48 crc kubenswrapper[4745]: E0319 00:10:48.901675 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.401659962 +0000 UTC m=+213.939855093 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.929517 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-t28kd" event={"ID":"ead6910f-478b-45b5-b83c-06d3733635cb","Type":"ContainerStarted","Data":"d1fba5ed2707a755db37c09bcae3bafe499cb631e80e4e99585d82257792a8de"} Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.930474 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.934280 4745 generic.go:334] "Generic (PLEG): container finished" podID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerID="47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863" exitCode=0 Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.934350 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" event={"ID":"16dafcd2-537e-46fe-8028-41bc6ff146a0","Type":"ContainerDied","Data":"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863"} Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.934380 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" event={"ID":"16dafcd2-537e-46fe-8028-41bc6ff146a0","Type":"ContainerDied","Data":"4ccfdcfde3a6b6c854686aad74a4c94d360f5531e573b9fe69f9444980375eba"} Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.934400 4745 scope.go:117] "RemoveContainer" 
containerID="47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.934519 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-v4wtx" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.971994 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" event={"ID":"11217236-1702-4cd3-b097-e3e410cbbdb4","Type":"ContainerStarted","Data":"5c0c2130c98ce029a089c367beee380b3f3abf95cf3be36859beaf586b5b748c"} Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.962556 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36754: no serving certificate available for the kubelet" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.980232 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-t28kd" podStartSLOduration=9.980207871 podStartE2EDuration="9.980207871s" podCreationTimestamp="2026-03-19 00:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:48.964607413 +0000 UTC m=+213.502802564" watchObservedRunningTime="2026-03-19 00:10:48.980207871 +0000 UTC m=+213.518402992" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.992260 4745 generic.go:334] "Generic (PLEG): container finished" podID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" containerID="adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1" exitCode=0 Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.993595 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.994023 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" event={"ID":"555c1cf8-c2b3-4e47-9fa9-314a8672b437","Type":"ContainerDied","Data":"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1"} Mar 19 00:10:48 crc kubenswrapper[4745]: I0319 00:10:48.994111 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj" event={"ID":"555c1cf8-c2b3-4e47-9fa9-314a8672b437","Type":"ContainerDied","Data":"4cb2967d4cad71f05028927d26e36d1cb64a836920dd56873c07f0bfdd7a7214"} Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:48.997626 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q9zn6"] Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.001416 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" containerName="route-controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.001505 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" containerName="route-controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.001566 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerName="controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.001640 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerName="controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.001855 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" 
containerName="route-controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.005586 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" containerName="controller-manager" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.008193 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.008499 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.008618 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.009126 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.509108801 +0000 UTC m=+214.047303932 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.009851 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.010095 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-v4wtx"] Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.010641 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.510628321 +0000 UTC m=+214.048823452 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.011914 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.042506 4745 scope.go:117] "RemoveContainer" containerID="47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.045516 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863\": container with ID starting with 47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863 not found: ID does not exist" containerID="47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.045559 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863"} err="failed to get container status \"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863\": rpc error: code = NotFound desc = could not find container \"47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863\": container with ID starting with 47b4758f5a68615a172f7cd80ed8157845149d7c53eb44826e9348749db31863 not found: ID does not exist" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.045583 4745 scope.go:117] "RemoveContainer" 
containerID="adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.057134 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q9zn6"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.066282 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2x4m" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.120732 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.121136 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdqmk\" (UniqueName: \"kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.121322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.121347 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content\") pod 
\"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.122507 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.622493087 +0000 UTC m=+214.160688218 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.142038 4745 scope.go:117] "RemoveContainer" containerID="adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.150066 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1\": container with ID starting with adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1 not found: ID does not exist" containerID="adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.150118 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1"} err="failed to get container status \"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1\": rpc error: code = NotFound desc = 
could not find container \"adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1\": container with ID starting with adf2cc77df1f884beefe84384d7ee072c68533d4c3c5638bd2390ab3062fcdd1 not found: ID does not exist" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.151590 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-v2dhg" podStartSLOduration=175.151575934 podStartE2EDuration="2m55.151575934s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:49.082262462 +0000 UTC m=+213.620457593" watchObservedRunningTime="2026-03-19 00:10:49.151575934 +0000 UTC m=+213.689771065" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.188233 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.205519 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-bjlzj"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.208641 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.217184 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.223032 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdqmk\" (UniqueName: \"kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.223076 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.223110 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.223137 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.224528 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content\") pod 
\"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.225080 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.725069986 +0000 UTC m=+214.263265107 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.225393 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities\") pod \"community-operators-q9zn6\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.237722 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.238866 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.259004 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdqmk\" (UniqueName: \"kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk\") pod \"community-operators-q9zn6\" (UID: 
\"c21b8175-025a-4d91-ad43-389dbad40846\") " pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.324619 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.325092 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.825060147 +0000 UTC m=+214.363255288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.325223 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.325283 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82wwh\" (UniqueName: 
\"kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.325391 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.325449 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.326014 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.826004219 +0000 UTC m=+214.364199350 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.366661 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.384968 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kgsn7"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.386336 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.413285 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgsn7"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.426630 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.426974 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc 
kubenswrapper[4745]: I0319 00:10:49.427003 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82wwh\" (UniqueName: \"kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.427056 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.427708 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.427725 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.427825 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:49.92780463 +0000 UTC m=+214.465999821 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.460449 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82wwh\" (UniqueName: \"kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh\") pod \"certified-operators-hhfzg\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.530203 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.530261 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.530289 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities\") pod 
\"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.530322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rkp49\" (UniqueName: \"kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.536235 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.036215152 +0000 UTC m=+214.574410353 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.559141 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.578373 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.579513 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.615646 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.633467 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.633824 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.633858 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.633885 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rkp49\" (UniqueName: \"kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.634604 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.634705 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.134686284 +0000 UTC m=+214.672881415 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.634713 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.664182 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rkp49\" (UniqueName: \"kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49\") pod \"community-operators-kgsn7\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.736064 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.736120 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.736154 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-695vg\" (UniqueName: \"kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.736212 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.736730 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.737632 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.237619093 +0000 UTC m=+214.775814224 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.789936 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:49 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:49 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:49 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.790287 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.835411 4745 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 19 00:10:49 crc 
kubenswrapper[4745]: I0319 00:10:49.838095 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.838554 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.838609 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.338565286 +0000 UTC m=+214.876760417 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.838903 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.838994 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.839056 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-695vg\" (UniqueName: \"kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.839479 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 00:10:50.339461326 +0000 UTC m=+214.877656457 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.840073 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.841614 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.860037 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-695vg\" (UniqueName: \"kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg\") pod \"certified-operators-g5dw2\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") " pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.871339 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.900718 4745 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q9zn6"] Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.926825 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g5dw2" Mar 19 00:10:49 crc kubenswrapper[4745]: W0319 00:10:49.934529 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc21b8175_025a_4d91_ad43_389dbad40846.slice/crio-44402f88db283736632f4638c8a012502a669c0564707bb1cc601815d7a854a9 WatchSource:0}: Error finding container 44402f88db283736632f4638c8a012502a669c0564707bb1cc601815d7a854a9: Status 404 returned error can't find the container with id 44402f88db283736632f4638c8a012502a669c0564707bb1cc601815d7a854a9 Mar 19 00:10:49 crc kubenswrapper[4745]: I0319 00:10:49.939685 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:49 crc kubenswrapper[4745]: E0319 00:10:49.940014 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.439999935 +0000 UTC m=+214.978195066 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.033714 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerStarted","Data":"44402f88db283736632f4638c8a012502a669c0564707bb1cc601815d7a854a9"} Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.035478 4745 generic.go:334] "Generic (PLEG): container finished" podID="75bf4c3d-1ce3-48df-8598-7f72667807c1" containerID="9e7db3c4b8160a045a6441db451fbc03b58d9027bbe08bfa7d59fe62a3ed7321" exitCode=0 Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.035742 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" event={"ID":"75bf4c3d-1ce3-48df-8598-7f72667807c1","Type":"ContainerDied","Data":"9e7db3c4b8160a045a6441db451fbc03b58d9027bbe08bfa7d59fe62a3ed7321"} Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.055241 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.055818 4745 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.555804663 +0000 UTC m=+215.093999794 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.087352 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" event={"ID":"b7b46d81-2c53-4021-be8a-f962c576a94c","Type":"ContainerStarted","Data":"872a2af3b5e410cebc7b95e37c95c913e524a6dae4e0a087b8b1a382f2544e63"} Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.096575 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerStarted","Data":"1385db0e9218cd6a53bd844f3c99f4797ac5eed30c3c8451117c54ef69a818d9"} Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.147956 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16dafcd2-537e-46fe-8028-41bc6ff146a0" path="/var/lib/kubelet/pods/16dafcd2-537e-46fe-8028-41bc6ff146a0/volumes" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.152770 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555c1cf8-c2b3-4e47-9fa9-314a8672b437" path="/var/lib/kubelet/pods/555c1cf8-c2b3-4e47-9fa9-314a8672b437/volumes" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.156276 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.156634 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.656606262 +0000 UTC m=+215.194801393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.156711 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.157151 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.657137489 +0000 UTC m=+215.195332620 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.219800 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.257926 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.258475 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.758453505 +0000 UTC m=+215.296648646 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.258501 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.259486 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.261121 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.264278 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.764257788 +0000 UTC m=+215.302453009 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.266211 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.266722 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.266826 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.266985 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.267106 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.276138 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.277307 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.282357 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.282651 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.282688 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.282962 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.283021 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.284993 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.285226 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.286432 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.300314 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kgsn7"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.300373 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 
00:10:50.300387 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.328353 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 00:10:50 crc kubenswrapper[4745]: W0319 00:10:50.334604 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09d29a41_94df_42b0_b7d3_6b47b06a238f.slice/crio-9abf27759067afc6e0e47fd16a9553d00f8c09a95f77720200901bcf6c2854bb WatchSource:0}: Error finding container 9abf27759067afc6e0e47fd16a9553d00f8c09a95f77720200901bcf6c2854bb: Status 404 returned error can't find the container with id 9abf27759067afc6e0e47fd16a9553d00f8c09a95f77720200901bcf6c2854bb Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.341256 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.342200 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.346480 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.350213 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.350351 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.353355 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-zxjjt" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.362174 4745 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-19T00:10:49.835440833Z","Handler":null,"Name":""} Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.364132 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.364411 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.364480 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.364691 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.864651033 +0000 UTC m=+215.402846154 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365351 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g62l6\" (UniqueName: \"kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365401 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 
crc kubenswrapper[4745]: I0319 00:10:50.365446 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365494 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365519 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365545 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nsqlz\" (UniqueName: \"kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365563 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.365582 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: E0319 00:10:50.365917 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 00:10:50.865901314 +0000 UTC m=+215.404096445 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ggt62" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.383810 4745 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.383848 4745 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469416 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469826 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g62l6\" (UniqueName: \"kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469856 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469931 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469970 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.469987 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nsqlz\" (UniqueName: \"kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.470001 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 
00:10:50.470020 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.470042 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.470094 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.470113 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.470150 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " 
pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.482259 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.488088 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.514233 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.536800 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.537342 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.537760 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.538224 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.550378 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g62l6\" (UniqueName: \"kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.550490 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert\") pod \"route-controller-manager-5d896c4bf4-rssdk\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 
00:10:50.575046 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.575487 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.575523 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.575966 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.579061 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nsqlz\" (UniqueName: \"kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz\") pod \"controller-manager-cb9f8c68d-vtnjf\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 
crc kubenswrapper[4745]: I0319 00:10:50.585042 4745 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.585091 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.615304 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.615601 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.641593 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.699286 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ggt62\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.733612 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.852402 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:50 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:50 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:50 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.852462 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.863611 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:10:50 crc kubenswrapper[4745]: I0319 00:10:50.961233 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.091909 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.109637 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" event={"ID":"4dac052a-7e93-4343-901a-6b0cfb885cc4","Type":"ContainerStarted","Data":"61b844b715e359e6d46bde055bd1277533b2b0f0ca8e92400f5004133e860078"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.114116 4745 generic.go:334] "Generic (PLEG): container finished" podID="c21b8175-025a-4d91-ad43-389dbad40846" containerID="80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0" exitCode=0 Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.114189 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerDied","Data":"80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.118391 4745 generic.go:334] "Generic (PLEG): container finished" podID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerID="3c0dd7e0c251e39bd78fdfc535f458fe29dccacfeda18a0fdd0fe102becb3d5f" exitCode=0 Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.118458 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerDied","Data":"3c0dd7e0c251e39bd78fdfc535f458fe29dccacfeda18a0fdd0fe102becb3d5f"} Mar 19 00:10:51 crc 
kubenswrapper[4745]: I0319 00:10:51.118482 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerStarted","Data":"9abf27759067afc6e0e47fd16a9553d00f8c09a95f77720200901bcf6c2854bb"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.144350 4745 generic.go:334] "Generic (PLEG): container finished" podID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerID="aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db" exitCode=0 Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.144971 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerDied","Data":"aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.161083 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.163871 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" event={"ID":"b7b46d81-2c53-4021-be8a-f962c576a94c","Type":"ContainerStarted","Data":"7448deb36fe99ac300f92c9645b927ee7a1f8f2550139c1903a5fa47745379cf"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.169034 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" event={"ID":"b7b46d81-2c53-4021-be8a-f962c576a94c","Type":"ContainerStarted","Data":"1b0420993225746b8972ede8e7be35907c73573aa37a33cea54d5d2273ae112d"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.182734 4745 generic.go:334] "Generic (PLEG): container finished" podID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerID="737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925" exitCode=0 Mar 19 00:10:51 crc 
kubenswrapper[4745]: I0319 00:10:51.183043 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerDied","Data":"737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.183070 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerStarted","Data":"5cf5d6c8b2c76c4c80fde1c17a6532692631c387bb1a224bdbe1f73591bc68b3"} Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.195825 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.209185 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.209366 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.213464 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.217736 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-54kzj" podStartSLOduration=12.21770309 podStartE2EDuration="12.21770309s" podCreationTimestamp="2026-03-19 00:10:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:51.213512232 +0000 UTC m=+215.751707383" watchObservedRunningTime="2026-03-19 00:10:51.21770309 +0000 UTC m=+215.755898221" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.236718 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:10:51 crc kubenswrapper[4745]: W0319 00:10:51.261900 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4be3ad0_fb20_41e9_9aaf_e55d86cdd1bb.slice/crio-2b9a54ac81f1adc06f169354318daf4a47687d40778d5e79511d9060b89e1649 WatchSource:0}: Error finding container 2b9a54ac81f1adc06f169354318daf4a47687d40778d5e79511d9060b89e1649: Status 404 returned error can't find the container with id 2b9a54ac81f1adc06f169354318daf4a47687d40778d5e79511d9060b89e1649 Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.287317 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 
00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.287446 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqj2f\" (UniqueName: \"kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.287524 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.368583 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.369584 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.379723 4745 patch_prober.go:28] interesting pod/console-f9d7485db-ssbjs container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.379785 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-ssbjs" podUID="053b13b0-078a-45ea-a005-e38aab17b42f" containerName="console" probeResult="failure" output="Get \"https://10.217.0.11:8443/health\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.390436 4745 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.390515 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqj2f\" (UniqueName: \"kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.390571 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.391141 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.391208 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.418410 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqj2f\" (UniqueName: 
\"kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f\") pod \"redhat-marketplace-mtjq5\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.474909 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.533108 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.541108 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.582712 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36764: no serving certificate available for the kubelet" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.595423 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume\") pod \"75bf4c3d-1ce3-48df-8598-7f72667807c1\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.595528 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wwv7s\" (UniqueName: \"kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s\") pod \"75bf4c3d-1ce3-48df-8598-7f72667807c1\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.595684 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume\") pod 
\"75bf4c3d-1ce3-48df-8598-7f72667807c1\" (UID: \"75bf4c3d-1ce3-48df-8598-7f72667807c1\") " Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.597558 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:10:51 crc kubenswrapper[4745]: E0319 00:10:51.597844 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75bf4c3d-1ce3-48df-8598-7f72667807c1" containerName="collect-profiles" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.598166 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="75bf4c3d-1ce3-48df-8598-7f72667807c1" containerName="collect-profiles" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.598404 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="75bf4c3d-1ce3-48df-8598-7f72667807c1" containerName="collect-profiles" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.601846 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.597564 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume" (OuterVolumeSpecName: "config-volume") pod "75bf4c3d-1ce3-48df-8598-7f72667807c1" (UID: "75bf4c3d-1ce3-48df-8598-7f72667807c1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.614069 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "75bf4c3d-1ce3-48df-8598-7f72667807c1" (UID: "75bf4c3d-1ce3-48df-8598-7f72667807c1"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.614261 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s" (OuterVolumeSpecName: "kube-api-access-wwv7s") pod "75bf4c3d-1ce3-48df-8598-7f72667807c1" (UID: "75bf4c3d-1ce3-48df-8598-7f72667807c1"). InnerVolumeSpecName "kube-api-access-wwv7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.623742 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.686711 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-ljtrr container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.686755 4745 patch_prober.go:28] interesting pod/downloads-7954f5f757-ljtrr container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.686795 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-ljtrr" podUID="38dd3b53-64de-4201-b427-0b1bc3e51849" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.686830 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-ljtrr" podUID="38dd3b53-64de-4201-b427-0b1bc3e51849" containerName="download-server" probeResult="failure" output="Get 
\"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697174 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sq28\" (UniqueName: \"kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697364 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697447 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/75bf4c3d-1ce3-48df-8598-7f72667807c1-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697463 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/75bf4c3d-1ce3-48df-8598-7f72667807c1-config-volume\") on node \"crc\" DevicePath \"\""
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.697476 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wwv7s\" (UniqueName: \"kubernetes.io/projected/75bf4c3d-1ce3-48df-8598-7f72667807c1-kube-api-access-wwv7s\") on node \"crc\" DevicePath \"\""
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.788295 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-jrq7v"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.794857 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 00:10:51 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld
Mar 19 00:10:51 crc kubenswrapper[4745]: [+]process-running ok
Mar 19 00:10:51 crc kubenswrapper[4745]: healthz check failed
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.794998 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.808611 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sq28\" (UniqueName: \"kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.808666 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.808741 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.809362 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.809908 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.834472 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sq28\" (UniqueName: \"kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28\") pod \"redhat-marketplace-wf9ss\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:10:51 crc kubenswrapper[4745]: I0319 00:10:51.933816 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.009685 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"]
Mar 19 00:10:52 crc kubenswrapper[4745]: W0319 00:10:52.031843 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3c406d_9994_4629_b585_4d145b1e04aa.slice/crio-3b83c823529db11222e49bc92d4be77a23b45eab02e962cb6761c2af219fd176 WatchSource:0}: Error finding container 3b83c823529db11222e49bc92d4be77a23b45eab02e962cb6761c2af219fd176: Status 404 returned error can't find the container with id 3b83c823529db11222e49bc92d4be77a23b45eab02e962cb6761c2af219fd176
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.165257 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.214318 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"]
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.231659 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.256367 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"]
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.263838 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.319251 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.319398 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rttcj\" (UniqueName: \"kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.319481 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.348538 4745 generic.go:334] "Generic (PLEG): container finished" podID="6368460d-1bb9-4315-9730-1cf1673361fe" containerID="5b74ce15b598bc3286b90032d7a9c3ca2e9d7505c2ee3febadcd909eeca62c01" exitCode=0
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.348662 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6368460d-1bb9-4315-9730-1cf1673361fe","Type":"ContainerDied","Data":"5b74ce15b598bc3286b90032d7a9c3ca2e9d7505c2ee3febadcd909eeca62c01"}
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.348702 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6368460d-1bb9-4315-9730-1cf1673361fe","Type":"ContainerStarted","Data":"83e03c767a24ab1198cfb801009d43bc183be0d17f5e9bd4c865466aea4b9acd"}
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.368901 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" event={"ID":"b246ac53-9c42-426b-97da-3ca4075766ab","Type":"ContainerStarted","Data":"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0"}
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.368945 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" event={"ID":"b246ac53-9c42-426b-97da-3ca4075766ab","Type":"ContainerStarted","Data":"de743d9871d9e519d057ac18763fdb10aeceb0154e79a293ccaea85445d780d3"}
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.369908 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.420507 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.420553 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rttcj\" (UniqueName: \"kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.420629 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.421480 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.421789 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.425536 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg" event={"ID":"75bf4c3d-1ce3-48df-8598-7f72667807c1","Type":"ContainerDied","Data":"6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121"}
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.425587 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c709a0699c6d486d1e3770457647591e90dfba805ee2d7741ba1d2f696a5121"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.425671 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.477479 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rttcj\" (UniqueName: \"kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj\") pod \"redhat-operators-75vmv\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.490088 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" event={"ID":"4dac052a-7e93-4343-901a-6b0cfb885cc4","Type":"ContainerStarted","Data":"6541d510764cf79da71cf58a66703eed8f6a428fb89fbea76cd701f41b751e9a"}
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.490501 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.496324 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.509514 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerStarted","Data":"3b83c823529db11222e49bc92d4be77a23b45eab02e962cb6761c2af219fd176"}
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.538553 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" podStartSLOduration=178.538531238 podStartE2EDuration="2m58.538531238s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:52.436004772 +0000 UTC m=+216.974199903" watchObservedRunningTime="2026-03-19 00:10:52.538531238 +0000 UTC m=+217.076726369"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.539036 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" podStartSLOduration=4.539019764 podStartE2EDuration="4.539019764s" podCreationTimestamp="2026-03-19 00:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:52.537477373 +0000 UTC m=+217.075672524" watchObservedRunningTime="2026-03-19 00:10:52.539019764 +0000 UTC m=+217.077214905"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.577411 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" event={"ID":"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb","Type":"ContainerStarted","Data":"b7db85026fab40c5726ccf8e6eceef62a6eb94ba00083b9af9f1d5aa94f1456d"}
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.577546 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" event={"ID":"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb","Type":"ContainerStarted","Data":"2b9a54ac81f1adc06f169354318daf4a47687d40778d5e79511d9060b89e1649"}
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.581806 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"]
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.582682 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-75vmv"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.586643 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.655503 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"]
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.658399 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" podStartSLOduration=4.658376479 podStartE2EDuration="4.658376479s" podCreationTimestamp="2026-03-19 00:10:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:52.623029905 +0000 UTC m=+217.161225066" watchObservedRunningTime="2026-03-19 00:10:52.658376479 +0000 UTC m=+217.196571610"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.725618 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdzp9\" (UniqueName: \"kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.725973 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.726146 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.760928 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"]
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.801240 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 00:10:52 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld
Mar 19 00:10:52 crc kubenswrapper[4745]: [+]process-running ok
Mar 19 00:10:52 crc kubenswrapper[4745]: healthz check failed
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.801293 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 00:10:52 crc kubenswrapper[4745]: W0319 00:10:52.807036 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod04cc89b5_7bac_4b91_bb97_a1f5ab14260c.slice/crio-9d959c62c69d3a59b0e990d8806f87c5f90630c2ae5bf2656f164c9a4cd2a4ac WatchSource:0}: Error finding container 9d959c62c69d3a59b0e990d8806f87c5f90630c2ae5bf2656f164c9a4cd2a4ac: Status 404 returned error can't find the container with id 9d959c62c69d3a59b0e990d8806f87c5f90630c2ae5bf2656f164c9a4cd2a4ac
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.828235 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.828312 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdzp9\" (UniqueName: \"kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.828376 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.828911 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.828935 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:52 crc kubenswrapper[4745]: I0319 00:10:52.852207 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdzp9\" (UniqueName: \"kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9\") pod \"redhat-operators-cgghw\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.058783 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.077520 4745 ???:1] "http: TLS handshake error from 192.168.126.11:36776: no serving certificate available for the kubelet"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.136938 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.137020 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.137153 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.137180 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.138130 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.141730 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.145035 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7-metrics-certs\") pod \"network-metrics-daemon-4r5k5\" (UID: \"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7\") " pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.145262 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.145692 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.161762 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.184971 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"]
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.354782 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.365942 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.379677 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.392873 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4r5k5"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.610561 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"]
Mar 19 00:10:53 crc kubenswrapper[4745]: W0319 00:10:53.620744 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb19d4fad_672f_40f3_bfdb_53b36da06399.slice/crio-a6f79e4dda71ca031bc419e937553d73dcdca6b9aad0060adce374c4237e4880 WatchSource:0}: Error finding container a6f79e4dda71ca031bc419e937553d73dcdca6b9aad0060adce374c4237e4880: Status 404 returned error can't find the container with id a6f79e4dda71ca031bc419e937553d73dcdca6b9aad0060adce374c4237e4880
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.643437 4745 generic.go:334] "Generic (PLEG): container finished" podID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerID="6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa" exitCode=0
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.643590 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerDied","Data":"6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa"}
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.643642 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerStarted","Data":"9d959c62c69d3a59b0e990d8806f87c5f90630c2ae5bf2656f164c9a4cd2a4ac"}
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.652921 4745 generic.go:334] "Generic (PLEG): container finished" podID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerID="595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1" exitCode=0
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.653068 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerDied","Data":"595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1"}
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.673955 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerStarted","Data":"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4"}
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.675085 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.675105 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerStarted","Data":"2cd3908399145e2519a565664cfaff071ac8bf459660c66c4bd6a1d4b7d2532a"}
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.689619 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.747371 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.748376 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.750282 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.763773 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.772654 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.792024 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 19 00:10:53 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld
Mar 19 00:10:53 crc kubenswrapper[4745]: [+]process-running ok
Mar 19 00:10:53 crc kubenswrapper[4745]: healthz check failed
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.792087 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.873468 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.873559 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.974780 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.974880 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 19 00:10:53 crc kubenswrapper[4745]: I0319 00:10:53.974994 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.012156 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.117969 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.182748 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4r5k5"]
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.282058 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 19 00:10:54 crc kubenswrapper[4745]: W0319 00:10:54.361477 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-5c58eab5939e49bc5a07ab38cb0867da191941123050c5f66713acc92d6977be WatchSource:0}: Error finding container 5c58eab5939e49bc5a07ab38cb0867da191941123050c5f66713acc92d6977be: Status 404 returned error can't find the container with id 5c58eab5939e49bc5a07ab38cb0867da191941123050c5f66713acc92d6977be
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.381567 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access\") pod \"6368460d-1bb9-4315-9730-1cf1673361fe\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") "
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.381648 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir\") pod \"6368460d-1bb9-4315-9730-1cf1673361fe\" (UID: \"6368460d-1bb9-4315-9730-1cf1673361fe\") "
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.382084 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6368460d-1bb9-4315-9730-1cf1673361fe" (UID: "6368460d-1bb9-4315-9730-1cf1673361fe"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.406136 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6368460d-1bb9-4315-9730-1cf1673361fe" (UID: "6368460d-1bb9-4315-9730-1cf1673361fe"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.484019 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6368460d-1bb9-4315-9730-1cf1673361fe-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.484074 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6368460d-1bb9-4315-9730-1cf1673361fe-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.773601 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"5c58eab5939e49bc5a07ab38cb0867da191941123050c5f66713acc92d6977be"}
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.782934 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ace90e2b52fc2f5e41ef9ec72047f4210de51cf80ccfd73296dd54a80d3ad814"}
Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.791772 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:54 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:54 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:54 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.791936 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.808513 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"6368460d-1bb9-4315-9730-1cf1673361fe","Type":"ContainerDied","Data":"83e03c767a24ab1198cfb801009d43bc183be0d17f5e9bd4c865466aea4b9acd"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.808566 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83e03c767a24ab1198cfb801009d43bc183be0d17f5e9bd4c865466aea4b9acd" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.808672 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.834325 4745 generic.go:334] "Generic (PLEG): container finished" podID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerID="478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4" exitCode=0 Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.834473 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerDied","Data":"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.859060 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" event={"ID":"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7","Type":"ContainerStarted","Data":"ed2fc26656f61bb585323ed6fc040e749d0c185124a74d28a5d612ff3a8ded9c"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.867628 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"af81edb17dda05bfeff969a5069a4a4f9214d26fc97643a708c50e839c9089fd"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.877754 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerStarted","Data":"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6"} Mar 19 00:10:54 crc kubenswrapper[4745]: I0319 00:10:54.878103 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerStarted","Data":"a6f79e4dda71ca031bc419e937553d73dcdca6b9aad0060adce374c4237e4880"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 
00:10:55.154306 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 00:10:55 crc kubenswrapper[4745]: W0319 00:10:55.194569 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod10ad2429_ed3b_4688_8ee9_361a2ea56579.slice/crio-142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646 WatchSource:0}: Error finding container 142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646: Status 404 returned error can't find the container with id 142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646 Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.803096 4745 patch_prober.go:28] interesting pod/router-default-5444994796-jrq7v container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 00:10:55 crc kubenswrapper[4745]: [-]has-synced failed: reason withheld Mar 19 00:10:55 crc kubenswrapper[4745]: [+]process-running ok Mar 19 00:10:55 crc kubenswrapper[4745]: healthz check failed Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.803705 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-jrq7v" podUID="5c3bf4d2-ad08-42d0-bd92-f94074fc4833" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.896201 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" event={"ID":"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7","Type":"ContainerStarted","Data":"5bba45c36d6c7e4eeb534a2f79a4a410cf1d990596123b87933c698abdf5ab44"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.905785 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"ef991f8cac41319612cb2cf2e0b4460c10fc4ce79aa2c3c97624c2bf620b9b17"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.906991 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.909726 4745 generic.go:334] "Generic (PLEG): container finished" podID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerID="eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6" exitCode=0 Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.909799 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerDied","Data":"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.916014 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0d351bba75fabd782d641353de41feb49f9783105b3fb6ab72b2760acb6d10d1"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.917593 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"10ad2429-ed3b-4688-8ee9-361a2ea56579","Type":"ContainerStarted","Data":"142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646"} Mar 19 00:10:55 crc kubenswrapper[4745]: I0319 00:10:55.921642 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"71f012d5eab7f1600b917d7ba192a81649e4aaa214f494b6d6d830b83eec1ab3"} Mar 19 00:10:56 crc kubenswrapper[4745]: I0319 
00:10:56.731644 4745 ???:1] "http: TLS handshake error from 192.168.126.11:47450: no serving certificate available for the kubelet" Mar 19 00:10:56 crc kubenswrapper[4745]: I0319 00:10:56.796858 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:56 crc kubenswrapper[4745]: I0319 00:10:56.800909 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-jrq7v" Mar 19 00:10:56 crc kubenswrapper[4745]: I0319 00:10:56.941575 4745 generic.go:334] "Generic (PLEG): container finished" podID="10ad2429-ed3b-4688-8ee9-361a2ea56579" containerID="62c1b52c2f117491bd97c470bf6b3b0a2a679c1e20ac5a146452b907ba153571" exitCode=0 Mar 19 00:10:56 crc kubenswrapper[4745]: I0319 00:10:56.941752 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"10ad2429-ed3b-4688-8ee9-361a2ea56579","Type":"ContainerDied","Data":"62c1b52c2f117491bd97c470bf6b3b0a2a679c1e20ac5a146452b907ba153571"} Mar 19 00:10:57 crc kubenswrapper[4745]: I0319 00:10:57.325476 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-t28kd" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.039549 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4r5k5" event={"ID":"5fb0b1fb-7b2d-48e8-8db5-32fc94ea3bd7","Type":"ContainerStarted","Data":"d0e57813f5b03e4ca579c832c1ba05dc0c0e653d2d1c5d9ead50bb120b0518a1"} Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.057089 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4r5k5" podStartSLOduration=184.057065514 podStartE2EDuration="3m4.057065514s" podCreationTimestamp="2026-03-19 00:07:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:10:58.055029966 +0000 UTC m=+222.593225097" watchObservedRunningTime="2026-03-19 00:10:58.057065514 +0000 UTC m=+222.595260645" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.568985 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.592052 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access\") pod \"10ad2429-ed3b-4688-8ee9-361a2ea56579\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.592208 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir\") pod \"10ad2429-ed3b-4688-8ee9-361a2ea56579\" (UID: \"10ad2429-ed3b-4688-8ee9-361a2ea56579\") " Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.592588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "10ad2429-ed3b-4688-8ee9-361a2ea56579" (UID: "10ad2429-ed3b-4688-8ee9-361a2ea56579"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.602196 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "10ad2429-ed3b-4688-8ee9-361a2ea56579" (UID: "10ad2429-ed3b-4688-8ee9-361a2ea56579"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.697648 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/10ad2429-ed3b-4688-8ee9-361a2ea56579-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:58 crc kubenswrapper[4745]: I0319 00:10:58.698240 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/10ad2429-ed3b-4688-8ee9-361a2ea56579-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:10:59 crc kubenswrapper[4745]: I0319 00:10:59.082431 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 00:10:59 crc kubenswrapper[4745]: I0319 00:10:59.082423 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"10ad2429-ed3b-4688-8ee9-361a2ea56579","Type":"ContainerDied","Data":"142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646"} Mar 19 00:10:59 crc kubenswrapper[4745]: I0319 00:10:59.082508 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="142ff4f106a756ab212d227b3fb4b768b296da3b66956a4bcc02aa0706b27646" Mar 19 00:11:01 crc kubenswrapper[4745]: I0319 00:11:01.374478 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:11:01 crc kubenswrapper[4745]: I0319 00:11:01.380336 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-ssbjs" Mar 19 00:11:01 crc kubenswrapper[4745]: I0319 00:11:01.691205 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-ljtrr" Mar 19 00:11:06 crc kubenswrapper[4745]: I0319 00:11:06.941023 4745 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:11:06 crc kubenswrapper[4745]: I0319 00:11:06.941766 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" podUID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" containerName="controller-manager" containerID="cri-o://b7db85026fab40c5726ccf8e6eceef62a6eb94ba00083b9af9f1d5aa94f1456d" gracePeriod=30 Mar 19 00:11:06 crc kubenswrapper[4745]: I0319 00:11:06.960038 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:11:06 crc kubenswrapper[4745]: I0319 00:11:06.960249 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" containerID="cri-o://6541d510764cf79da71cf58a66703eed8f6a428fb89fbea76cd701f41b751e9a" gracePeriod=30 Mar 19 00:11:08 crc kubenswrapper[4745]: I0319 00:11:08.165958 4745 generic.go:334] "Generic (PLEG): container finished" podID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerID="6541d510764cf79da71cf58a66703eed8f6a428fb89fbea76cd701f41b751e9a" exitCode=0 Mar 19 00:11:08 crc kubenswrapper[4745]: I0319 00:11:08.166047 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" event={"ID":"4dac052a-7e93-4343-901a-6b0cfb885cc4","Type":"ContainerDied","Data":"6541d510764cf79da71cf58a66703eed8f6a428fb89fbea76cd701f41b751e9a"} Mar 19 00:11:08 crc kubenswrapper[4745]: I0319 00:11:08.168414 4745 generic.go:334] "Generic (PLEG): container finished" podID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" containerID="b7db85026fab40c5726ccf8e6eceef62a6eb94ba00083b9af9f1d5aa94f1456d" exitCode=0 Mar 19 00:11:08 crc kubenswrapper[4745]: 
I0319 00:11:08.168445 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" event={"ID":"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb","Type":"ContainerDied","Data":"b7db85026fab40c5726ccf8e6eceef62a6eb94ba00083b9af9f1d5aa94f1456d"} Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.513262 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549123 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:09 crc kubenswrapper[4745]: E0319 00:11:09.549513 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6368460d-1bb9-4315-9730-1cf1673361fe" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549540 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6368460d-1bb9-4315-9730-1cf1673361fe" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: E0319 00:11:09.549552 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10ad2429-ed3b-4688-8ee9-361a2ea56579" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549566 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="10ad2429-ed3b-4688-8ee9-361a2ea56579" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: E0319 00:11:09.549582 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" containerName="controller-manager" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549593 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" containerName="controller-manager" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549797 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="10ad2429-ed3b-4688-8ee9-361a2ea56579" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549821 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" containerName="controller-manager" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.549835 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6368460d-1bb9-4315-9730-1cf1673361fe" containerName="pruner" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.550598 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.553706 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.570372 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca\") pod \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.570457 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nsqlz\" (UniqueName: \"kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz\") pod \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.570538 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert\") pod \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.570562 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles\") pod \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.570629 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config\") pod \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\" (UID: \"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb\") " Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.572018 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config" (OuterVolumeSpecName: "config") pod "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" (UID: "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.572536 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" (UID: "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.572905 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca" (OuterVolumeSpecName: "client-ca") pod "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" (UID: "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.581589 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz" (OuterVolumeSpecName: "kube-api-access-nsqlz") pod "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" (UID: "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb"). InnerVolumeSpecName "kube-api-access-nsqlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.581606 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" (UID: "b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.671898 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.672224 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.672322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.672538 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.672675 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp7vc\" (UniqueName: \"kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.672961 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.673019 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nsqlz\" (UniqueName: \"kubernetes.io/projected/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-kube-api-access-nsqlz\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.673046 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 
00:11:09.673069 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.673092 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.773891 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xp7vc\" (UniqueName: \"kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.773947 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.774037 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.774066 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.774097 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.775616 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.775901 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.776495 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.782740 4745 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.789806 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp7vc\" (UniqueName: \"kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc\") pod \"controller-manager-66fbb79cf5-mhgjw\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:09 crc kubenswrapper[4745]: I0319 00:11:09.874819 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.179132 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" event={"ID":"b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb","Type":"ContainerDied","Data":"2b9a54ac81f1adc06f169354318daf4a47687d40778d5e79511d9060b89e1649"} Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.179194 4745 scope.go:117] "RemoveContainer" containerID="b7db85026fab40c5726ccf8e6eceef62a6eb94ba00083b9af9f1d5aa94f1456d" Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.179216 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf" Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.195094 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.197736 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-cb9f8c68d-vtnjf"] Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.616556 4745 patch_prober.go:28] interesting pod/route-controller-manager-5d896c4bf4-rssdk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.616624 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Mar 19 00:11:10 crc kubenswrapper[4745]: I0319 00:11:10.869334 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:11:12 crc kubenswrapper[4745]: I0319 00:11:12.145655 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb" path="/var/lib/kubelet/pods/b4be3ad0-fb20-41e9-9aaf-e55d86cdd1bb/volumes" Mar 19 00:11:15 crc kubenswrapper[4745]: I0319 00:11:15.606052 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:11:15 crc kubenswrapper[4745]: I0319 00:11:15.606346 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:11:17 crc kubenswrapper[4745]: I0319 00:11:17.238426 4745 ???:1] "http: TLS handshake error from 192.168.126.11:47750: no serving certificate available for the kubelet" Mar 19 00:11:18 crc kubenswrapper[4745]: I0319 00:11:18.220350 4745 generic.go:334] "Generic (PLEG): container finished" podID="88004414-de81-4e3c-9f3f-99f90a3bbc98" containerID="f0c4e6e413094127a3df81fb923d127a9cc66b65e1e4d1cf289b3133f8a3d81a" exitCode=0 Mar 19 00:11:18 crc kubenswrapper[4745]: I0319 00:11:18.220426 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29564640-xrq9h" event={"ID":"88004414-de81-4e3c-9f3f-99f90a3bbc98","Type":"ContainerDied","Data":"f0c4e6e413094127a3df81fb923d127a9cc66b65e1e4d1cf289b3133f8a3d81a"} Mar 19 00:11:21 crc kubenswrapper[4745]: I0319 00:11:21.617206 4745 patch_prober.go:28] interesting pod/route-controller-manager-5d896c4bf4-rssdk container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:11:21 crc kubenswrapper[4745]: I0319 00:11:21.617288 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.233803 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-q4x24" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.328655 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.330379 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.334595 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.335173 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.335300 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.457175 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.457538 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: 
\"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.559240 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.559541 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.559355 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.596504 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:22 crc kubenswrapper[4745]: E0319 00:11:22.609741 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 19 00:11:22 crc kubenswrapper[4745]: E0319 00:11:22.609920 4745 
kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 00:11:22 crc kubenswrapper[4745]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 19 00:11:22 crc kubenswrapper[4745]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c5kr4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29564650-7k6ld_openshift-infra(d14d18f5-0177-4458-8ea3-b266cc96d658): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 19 00:11:22 crc kubenswrapper[4745]: > logger="UnhandledError" Mar 19 00:11:22 crc kubenswrapper[4745]: E0319 00:11:22.611069 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" Mar 19 00:11:22 crc kubenswrapper[4745]: I0319 00:11:22.651202 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.207285 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.216741 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.253069 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" event={"ID":"4dac052a-7e93-4343-901a-6b0cfb885cc4","Type":"ContainerDied","Data":"61b844b715e359e6d46bde055bd1277533b2b0f0ca8e92400f5004133e860078"} Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.253331 4745 scope.go:117] "RemoveContainer" containerID="6541d510764cf79da71cf58a66703eed8f6a428fb89fbea76cd701f41b751e9a" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.253416 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.260727 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-pruner-29564640-xrq9h" event={"ID":"88004414-de81-4e3c-9f3f-99f90a3bbc98","Type":"ContainerDied","Data":"f164174bebded768a6d9e6df2a3ae9216824193f2bf4ab662809fd13bd0bfb0d"} Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.260750 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-pruner-29564640-xrq9h" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.260767 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f164174bebded768a6d9e6df2a3ae9216824193f2bf4ab662809fd13bd0bfb0d" Mar 19 00:11:23 crc kubenswrapper[4745]: E0319 00:11:23.261920 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267559 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca\") pod \"88004414-de81-4e3c-9f3f-99f90a3bbc98\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267633 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g62l6\" (UniqueName: \"kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6\") pod \"4dac052a-7e93-4343-901a-6b0cfb885cc4\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267737 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config\") pod \"4dac052a-7e93-4343-901a-6b0cfb885cc4\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267756 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lnckx\" (UniqueName: \"kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx\") pod 
\"88004414-de81-4e3c-9f3f-99f90a3bbc98\" (UID: \"88004414-de81-4e3c-9f3f-99f90a3bbc98\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267780 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca\") pod \"4dac052a-7e93-4343-901a-6b0cfb885cc4\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.267812 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert\") pod \"4dac052a-7e93-4343-901a-6b0cfb885cc4\" (UID: \"4dac052a-7e93-4343-901a-6b0cfb885cc4\") " Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.268442 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca" (OuterVolumeSpecName: "serviceca") pod "88004414-de81-4e3c-9f3f-99f90a3bbc98" (UID: "88004414-de81-4e3c-9f3f-99f90a3bbc98"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.268643 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config" (OuterVolumeSpecName: "config") pod "4dac052a-7e93-4343-901a-6b0cfb885cc4" (UID: "4dac052a-7e93-4343-901a-6b0cfb885cc4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.269149 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca" (OuterVolumeSpecName: "client-ca") pod "4dac052a-7e93-4343-901a-6b0cfb885cc4" (UID: "4dac052a-7e93-4343-901a-6b0cfb885cc4"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.274017 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx" (OuterVolumeSpecName: "kube-api-access-lnckx") pod "88004414-de81-4e3c-9f3f-99f90a3bbc98" (UID: "88004414-de81-4e3c-9f3f-99f90a3bbc98"). InnerVolumeSpecName "kube-api-access-lnckx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.274071 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4dac052a-7e93-4343-901a-6b0cfb885cc4" (UID: "4dac052a-7e93-4343-901a-6b0cfb885cc4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.282781 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6" (OuterVolumeSpecName: "kube-api-access-g62l6") pod "4dac052a-7e93-4343-901a-6b0cfb885cc4" (UID: "4dac052a-7e93-4343-901a-6b0cfb885cc4"). InnerVolumeSpecName "kube-api-access-g62l6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.369933 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.369994 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lnckx\" (UniqueName: \"kubernetes.io/projected/88004414-de81-4e3c-9f3f-99f90a3bbc98-kube-api-access-lnckx\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.370009 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4dac052a-7e93-4343-901a-6b0cfb885cc4-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.370019 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4dac052a-7e93-4343-901a-6b0cfb885cc4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.370029 4745 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/88004414-de81-4e3c-9f3f-99f90a3bbc98-serviceca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.370038 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g62l6\" (UniqueName: \"kubernetes.io/projected/4dac052a-7e93-4343-901a-6b0cfb885cc4-kube-api-access-g62l6\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.582293 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:11:23 crc kubenswrapper[4745]: I0319 00:11:23.585660 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-5d896c4bf4-rssdk"] Mar 19 00:11:24 crc kubenswrapper[4745]: I0319 00:11:24.148387 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" path="/var/lib/kubelet/pods/4dac052a-7e93-4343-901a-6b0cfb885cc4/volumes" Mar 19 00:11:26 crc kubenswrapper[4745]: I0319 00:11:26.957541 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.076241 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:27 crc kubenswrapper[4745]: E0319 00:11:27.076467 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.076480 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" Mar 19 00:11:27 crc kubenswrapper[4745]: E0319 00:11:27.076493 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88004414-de81-4e3c-9f3f-99f90a3bbc98" containerName="image-pruner" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.076500 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="88004414-de81-4e3c-9f3f-99f90a3bbc98" containerName="image-pruner" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.076618 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dac052a-7e93-4343-901a-6b0cfb885cc4" containerName="route-controller-manager" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.076631 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="88004414-de81-4e3c-9f3f-99f90a3bbc98" containerName="image-pruner" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 
00:11:27.077045 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.080035 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.080873 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.081195 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.082723 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.084255 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.085840 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.129959 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.130212 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7qbm\" (UniqueName: \"kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 
00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.130249 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.130280 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.130303 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.231704 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7qbm\" (UniqueName: \"kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.231753 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.231784 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.231810 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.233383 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.233453 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 
00:11:27.241812 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.249036 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7qbm\" (UniqueName: \"kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm\") pod \"route-controller-manager-65b7fbb54-twj8q\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.393830 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:27 crc kubenswrapper[4745]: E0319 00:11:27.596461 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 00:11:27 crc kubenswrapper[4745]: E0319 00:11:27.596982 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rkp49,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-kgsn7_openshift-marketplace(09d29a41-94df-42b0-b7d3-6b47b06a238f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:27 crc kubenswrapper[4745]: E0319 00:11:27.598447 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-kgsn7" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" Mar 19 00:11:27 crc 
kubenswrapper[4745]: I0319 00:11:27.919266 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.924329 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:27 crc kubenswrapper[4745]: I0319 00:11:27.925228 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.046303 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.046380 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.046441 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.147229 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.147288 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.147339 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.147421 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.147453 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.172210 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access\") pod \"installer-9-crc\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:28 crc kubenswrapper[4745]: I0319 00:11:28.248148 4745 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.256649 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-kgsn7" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.355695 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.356146 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rttcj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-75vmv_openshift-marketplace(71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.357672 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-75vmv" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" Mar 19 00:11:31 crc 
kubenswrapper[4745]: E0319 00:11:31.360015 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.360130 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdqmk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-q9zn6_openshift-marketplace(c21b8175-025a-4d91-ad43-389dbad40846): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:31 crc kubenswrapper[4745]: E0319 00:11:31.361346 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-q9zn6" podUID="c21b8175-025a-4d91-ad43-389dbad40846" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.670413 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-75vmv" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.672634 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-q9zn6" podUID="c21b8175-025a-4d91-ad43-389dbad40846" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.756075 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.756556 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sqj2f,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-mtjq5_openshift-marketplace(2c3c406d-9994-4629-b585-4d145b1e04aa): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.757859 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-mtjq5" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.761086 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.761240 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8sq28,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSourc
e{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wf9ss_openshift-marketplace(04cc89b5-7bac-4b91-bb97-a1f5ab14260c): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.762814 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wf9ss" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.778076 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.778235 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mdzp9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cgghw_openshift-marketplace(b19d4fad-672f-40f3-bfdb-53b36da06399): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:32 crc kubenswrapper[4745]: E0319 00:11:32.779619 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cgghw" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" Mar 19 00:11:33 crc 
kubenswrapper[4745]: I0319 00:11:33.043561 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:33 crc kubenswrapper[4745]: I0319 00:11:33.361011 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.361854 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wf9ss" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.361961 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-mtjq5" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.362020 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cgghw" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" Mar 19 00:11:34 crc kubenswrapper[4745]: W0319 00:11:34.394315 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod98407a3b_8601_4632_b5b0_9308cfe2dbb6.slice/crio-9f0c65e6b1640a3ecd576682733bb4dc9007cf419c28d600a351e0baa6eb0fc3 WatchSource:0}: Error finding container 9f0c65e6b1640a3ecd576682733bb4dc9007cf419c28d600a351e0baa6eb0fc3: Status 404 returned error can't find the 
container with id 9f0c65e6b1640a3ecd576682733bb4dc9007cf419c28d600a351e0baa6eb0fc3 Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.468245 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.468522 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-82wwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy
:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-hhfzg_openshift-marketplace(0c1d22d3-b584-4622-856c-b531a5d1ad5d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.469691 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-hhfzg" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.499233 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.499365 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-695vg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-g5dw2_openshift-marketplace(ee0bf814-e571-41fe-9265-b77d8b53e20f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:11:34 crc kubenswrapper[4745]: E0319 00:11:34.500925 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-g5dw2" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" Mar 19 00:11:34 crc 
kubenswrapper[4745]: I0319 00:11:34.572551 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 00:11:34 crc kubenswrapper[4745]: I0319 00:11:34.828507 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 00:11:34 crc kubenswrapper[4745]: I0319 00:11:34.837304 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:34 crc kubenswrapper[4745]: W0319 00:11:34.837457 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcb131d37_4be2_4843_9ed6_21fc0636b07f.slice/crio-0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7 WatchSource:0}: Error finding container 0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7: Status 404 returned error can't find the container with id 0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7 Mar 19 00:11:34 crc kubenswrapper[4745]: W0319 00:11:34.839993 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c10585a_574b_4a55_8b88_9997418b9e02.slice/crio-77173caa51e84ae390dbeedd6df17d40cd949d3139f39b0e78d88f448392b2b6 WatchSource:0}: Error finding container 77173caa51e84ae390dbeedd6df17d40cd949d3139f39b0e78d88f448392b2b6: Status 404 returned error can't find the container with id 77173caa51e84ae390dbeedd6df17d40cd949d3139f39b0e78d88f448392b2b6 Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.320443 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" event={"ID":"9c10585a-574b-4a55-8b88-9997418b9e02","Type":"ContainerStarted","Data":"77173caa51e84ae390dbeedd6df17d40cd949d3139f39b0e78d88f448392b2b6"} Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.321606 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb131d37-4be2-4843-9ed6-21fc0636b07f","Type":"ContainerStarted","Data":"0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7"} Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.324082 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" event={"ID":"98407a3b-8601-4632-b5b0-9308cfe2dbb6","Type":"ContainerStarted","Data":"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612"} Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.324342 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerName="controller-manager" containerID="cri-o://b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612" gracePeriod=30 Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.324346 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" event={"ID":"98407a3b-8601-4632-b5b0-9308cfe2dbb6","Type":"ContainerStarted","Data":"9f0c65e6b1640a3ecd576682733bb4dc9007cf419c28d600a351e0baa6eb0fc3"} Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.324553 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.328347 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5698031f-9dc1-4457-a866-2fd312ebfa9e","Type":"ContainerStarted","Data":"0f87fd30d76b53dcac185b3503fb308808afdfd323c801d8401fbd4b0ed01bc0"} Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.328387 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5698031f-9dc1-4457-a866-2fd312ebfa9e","Type":"ContainerStarted","Data":"d51d30df499606a711683bc76abba35055a73b43b636926a63aab2f8353386ae"} Mar 19 00:11:35 crc kubenswrapper[4745]: E0319 00:11:35.345286 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-hhfzg" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" Mar 19 00:11:35 crc kubenswrapper[4745]: E0319 00:11:35.345464 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-g5dw2" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.348679 4745 patch_prober.go:28] interesting pod/controller-manager-66fbb79cf5-mhgjw container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:51696->10.217.0.58:8443: read: connection reset by peer" start-of-body= Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.349504 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:51696->10.217.0.58:8443: read: connection reset by peer" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.358070 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" podStartSLOduration=29.358043924 podStartE2EDuration="29.358043924s" podCreationTimestamp="2026-03-19 00:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:35.351775206 +0000 UTC m=+259.889970347" watchObservedRunningTime="2026-03-19 00:11:35.358043924 +0000 UTC m=+259.896239055" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.413273 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=13.413254618 podStartE2EDuration="13.413254618s" podCreationTimestamp="2026-03-19 00:11:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:35.393078888 +0000 UTC m=+259.931274029" watchObservedRunningTime="2026-03-19 00:11:35.413254618 +0000 UTC m=+259.951449749" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.694931 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.713052 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config\") pod \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.713105 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles\") pod \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.713154 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xp7vc\" (UniqueName: \"kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc\") pod \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.713180 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert\") pod \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.713213 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca\") pod \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\" (UID: \"98407a3b-8601-4632-b5b0-9308cfe2dbb6\") " Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.714018 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca" (OuterVolumeSpecName: "client-ca") pod "98407a3b-8601-4632-b5b0-9308cfe2dbb6" (UID: "98407a3b-8601-4632-b5b0-9308cfe2dbb6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.714085 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config" (OuterVolumeSpecName: "config") pod "98407a3b-8601-4632-b5b0-9308cfe2dbb6" (UID: "98407a3b-8601-4632-b5b0-9308cfe2dbb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.714562 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "98407a3b-8601-4632-b5b0-9308cfe2dbb6" (UID: "98407a3b-8601-4632-b5b0-9308cfe2dbb6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.723560 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc" (OuterVolumeSpecName: "kube-api-access-xp7vc") pod "98407a3b-8601-4632-b5b0-9308cfe2dbb6" (UID: "98407a3b-8601-4632-b5b0-9308cfe2dbb6"). InnerVolumeSpecName "kube-api-access-xp7vc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.727108 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "98407a3b-8601-4632-b5b0-9308cfe2dbb6" (UID: "98407a3b-8601-4632-b5b0-9308cfe2dbb6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.730512 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:35 crc kubenswrapper[4745]: E0319 00:11:35.730790 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerName="controller-manager" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.730807 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerName="controller-manager" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.730947 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerName="controller-manager" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.731314 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.746749 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814393 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814443 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles\") pod \"controller-manager-6c4857fb46-nxt5t\" 
(UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814463 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814667 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvgx5\" (UniqueName: \"kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814733 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814910 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98407a3b-8601-4632-b5b0-9308cfe2dbb6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814925 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:35 crc 
kubenswrapper[4745]: I0319 00:11:35.814936 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814946 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/98407a3b-8601-4632-b5b0-9308cfe2dbb6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.814956 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xp7vc\" (UniqueName: \"kubernetes.io/projected/98407a3b-8601-4632-b5b0-9308cfe2dbb6-kube-api-access-xp7vc\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.916284 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.916332 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.916358 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " 
pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.916417 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvgx5\" (UniqueName: \"kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.916444 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.917716 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.917951 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.918265 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca\") pod 
\"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.921267 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:35 crc kubenswrapper[4745]: I0319 00:11:35.934403 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvgx5\" (UniqueName: \"kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5\") pod \"controller-manager-6c4857fb46-nxt5t\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.086081 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.267737 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.336601 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" event={"ID":"9c10585a-574b-4a55-8b88-9997418b9e02","Type":"ContainerStarted","Data":"58ce3dd279b59a910658de2cc809956b25c1c5a660cb3d2371dd1d7acd584281"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.336981 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.337710 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" event={"ID":"14ce067d-78a6-4ed3-9295-fb73f2b931fb","Type":"ContainerStarted","Data":"4df8c028721f86f64f802110e34a6846149248c20cfe8d5077d6d03475ad3327"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.341744 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb131d37-4be2-4843-9ed6-21fc0636b07f","Type":"ContainerStarted","Data":"e4a052c3c9127a082f3e369e3635dfa6ffb8b29174c2b0096d5b731709aa71d5"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.343792 4745 generic.go:334] "Generic (PLEG): container finished" podID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" containerID="b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612" exitCode=0 Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.343845 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.343899 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" event={"ID":"98407a3b-8601-4632-b5b0-9308cfe2dbb6","Type":"ContainerDied","Data":"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.343957 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw" event={"ID":"98407a3b-8601-4632-b5b0-9308cfe2dbb6","Type":"ContainerDied","Data":"9f0c65e6b1640a3ecd576682733bb4dc9007cf419c28d600a351e0baa6eb0fc3"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.343977 4745 scope.go:117] "RemoveContainer" containerID="b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.344816 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.347119 4745 generic.go:334] "Generic (PLEG): container finished" podID="5698031f-9dc1-4457-a866-2fd312ebfa9e" containerID="0f87fd30d76b53dcac185b3503fb308808afdfd323c801d8401fbd4b0ed01bc0" exitCode=0 Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.347163 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5698031f-9dc1-4457-a866-2fd312ebfa9e","Type":"ContainerDied","Data":"0f87fd30d76b53dcac185b3503fb308808afdfd323c801d8401fbd4b0ed01bc0"} Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.358380 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" podStartSLOduration=9.358361425 
podStartE2EDuration="9.358361425s" podCreationTimestamp="2026-03-19 00:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:36.354366501 +0000 UTC m=+260.892561632" watchObservedRunningTime="2026-03-19 00:11:36.358361425 +0000 UTC m=+260.896556566" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.363805 4745 scope.go:117] "RemoveContainer" containerID="b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612" Mar 19 00:11:36 crc kubenswrapper[4745]: E0319 00:11:36.364166 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612\": container with ID starting with b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612 not found: ID does not exist" containerID="b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.364211 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612"} err="failed to get container status \"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612\": rpc error: code = NotFound desc = could not find container \"b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612\": container with ID starting with b95b361c3c16a9bb5a84e5527f327f4677ddd3abda4316dc9853a7f003e9d612 not found: ID does not exist" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.408778 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=9.408750848 podStartE2EDuration="9.408750848s" podCreationTimestamp="2026-03-19 00:11:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-19 00:11:36.406827735 +0000 UTC m=+260.945022866" watchObservedRunningTime="2026-03-19 00:11:36.408750848 +0000 UTC m=+260.946945979" Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.441128 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:36 crc kubenswrapper[4745]: I0319 00:11:36.447073 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-66fbb79cf5-mhgjw"] Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.362137 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" event={"ID":"14ce067d-78a6-4ed3-9295-fb73f2b931fb","Type":"ContainerStarted","Data":"cd7d8b9f0a89687fb705d0a182049cc385f2250f150a6c876b33a23f8286e636"} Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.662533 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.679264 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" podStartSLOduration=11.679238504 podStartE2EDuration="11.679238504s" podCreationTimestamp="2026-03-19 00:11:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:37.388036706 +0000 UTC m=+261.926231847" watchObservedRunningTime="2026-03-19 00:11:37.679238504 +0000 UTC m=+262.217433635" Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.749136 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir\") pod \"5698031f-9dc1-4457-a866-2fd312ebfa9e\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.749242 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access\") pod \"5698031f-9dc1-4457-a866-2fd312ebfa9e\" (UID: \"5698031f-9dc1-4457-a866-2fd312ebfa9e\") " Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.749295 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5698031f-9dc1-4457-a866-2fd312ebfa9e" (UID: "5698031f-9dc1-4457-a866-2fd312ebfa9e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.749608 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5698031f-9dc1-4457-a866-2fd312ebfa9e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.756550 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5698031f-9dc1-4457-a866-2fd312ebfa9e" (UID: "5698031f-9dc1-4457-a866-2fd312ebfa9e"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:37 crc kubenswrapper[4745]: I0319 00:11:37.851174 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5698031f-9dc1-4457-a866-2fd312ebfa9e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.145622 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98407a3b-8601-4632-b5b0-9308cfe2dbb6" path="/var/lib/kubelet/pods/98407a3b-8601-4632-b5b0-9308cfe2dbb6/volumes" Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.368206 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5698031f-9dc1-4457-a866-2fd312ebfa9e","Type":"ContainerDied","Data":"d51d30df499606a711683bc76abba35055a73b43b636926a63aab2f8353386ae"} Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.368248 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d51d30df499606a711683bc76abba35055a73b43b636926a63aab2f8353386ae" Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.368480 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.369311 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 00:11:38 crc kubenswrapper[4745]: I0319 00:11:38.374418 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:41 crc kubenswrapper[4745]: I0319 00:11:41.385425 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" event={"ID":"d14d18f5-0177-4458-8ea3-b266cc96d658","Type":"ContainerStarted","Data":"76cb550291e5cd7aae935eea1a8dd025dfbc6f11748c2597964f9ad53d8ac6b0"} Mar 19 00:11:41 crc kubenswrapper[4745]: I0319 00:11:41.404064 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" podStartSLOduration=45.706455093 podStartE2EDuration="1m41.404038501s" podCreationTimestamp="2026-03-19 00:10:00 +0000 UTC" firstStartedPulling="2026-03-19 00:10:45.253765238 +0000 UTC m=+209.791960369" lastFinishedPulling="2026-03-19 00:11:40.951348646 +0000 UTC m=+265.489543777" observedRunningTime="2026-03-19 00:11:41.399268778 +0000 UTC m=+265.937463929" watchObservedRunningTime="2026-03-19 00:11:41.404038501 +0000 UTC m=+265.942233632" Mar 19 00:11:41 crc kubenswrapper[4745]: I0319 00:11:41.589405 4745 csr.go:261] certificate signing request csr-r6h9t is approved, waiting to be issued Mar 19 00:11:41 crc kubenswrapper[4745]: I0319 00:11:41.595812 4745 csr.go:257] certificate signing request csr-r6h9t is issued Mar 19 00:11:42 crc kubenswrapper[4745]: I0319 00:11:42.392768 4745 generic.go:334] "Generic (PLEG): container finished" podID="d14d18f5-0177-4458-8ea3-b266cc96d658" containerID="76cb550291e5cd7aae935eea1a8dd025dfbc6f11748c2597964f9ad53d8ac6b0" exitCode=0 Mar 19 00:11:42 crc 
kubenswrapper[4745]: I0319 00:11:42.392818 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" event={"ID":"d14d18f5-0177-4458-8ea3-b266cc96d658","Type":"ContainerDied","Data":"76cb550291e5cd7aae935eea1a8dd025dfbc6f11748c2597964f9ad53d8ac6b0"} Mar 19 00:11:42 crc kubenswrapper[4745]: I0319 00:11:42.597525 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-06 08:37:05.371483213 +0000 UTC Mar 19 00:11:42 crc kubenswrapper[4745]: I0319 00:11:42.597903 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7040h25m22.773583555s for next certificate rotation Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.598975 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-02 09:58:54.517442097 +0000 UTC Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.599014 4745 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6201h47m10.918430965s for next certificate rotation Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.693394 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.751566 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5kr4\" (UniqueName: \"kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4\") pod \"d14d18f5-0177-4458-8ea3-b266cc96d658\" (UID: \"d14d18f5-0177-4458-8ea3-b266cc96d658\") " Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.757027 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4" (OuterVolumeSpecName: "kube-api-access-c5kr4") pod "d14d18f5-0177-4458-8ea3-b266cc96d658" (UID: "d14d18f5-0177-4458-8ea3-b266cc96d658"). InnerVolumeSpecName "kube-api-access-c5kr4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:43 crc kubenswrapper[4745]: I0319 00:11:43.852634 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5kr4\" (UniqueName: \"kubernetes.io/projected/d14d18f5-0177-4458-8ea3-b266cc96d658-kube-api-access-c5kr4\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:44 crc kubenswrapper[4745]: I0319 00:11:44.402423 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" event={"ID":"d14d18f5-0177-4458-8ea3-b266cc96d658","Type":"ContainerDied","Data":"158cd16e86d97bd741b6d9d3a091e473262326dad949142d4f04bb6f64676d4b"} Mar 19 00:11:44 crc kubenswrapper[4745]: I0319 00:11:44.402728 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="158cd16e86d97bd741b6d9d3a091e473262326dad949142d4f04bb6f64676d4b" Mar 19 00:11:44 crc kubenswrapper[4745]: I0319 00:11:44.402785 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564650-7k6ld" Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.410444 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerStarted","Data":"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9"} Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.606399 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.606753 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.606799 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.607413 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:11:45 crc kubenswrapper[4745]: I0319 00:11:45.607491 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" 
podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574" gracePeriod=600 Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.416803 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574" exitCode=0 Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.416921 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574"} Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.418990 4745 generic.go:334] "Generic (PLEG): container finished" podID="c21b8175-025a-4d91-ad43-389dbad40846" containerID="3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9" exitCode=0 Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.419018 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerDied","Data":"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9"} Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.956693 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.956988 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" podUID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" containerName="controller-manager" containerID="cri-o://cd7d8b9f0a89687fb705d0a182049cc385f2250f150a6c876b33a23f8286e636" gracePeriod=30 Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.971699 
4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:46 crc kubenswrapper[4745]: I0319 00:11:46.972571 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" containerName="route-controller-manager" containerID="cri-o://58ce3dd279b59a910658de2cc809956b25c1c5a660cb3d2371dd1d7acd584281" gracePeriod=30 Mar 19 00:11:47 crc kubenswrapper[4745]: I0319 00:11:47.394631 4745 patch_prober.go:28] interesting pod/route-controller-manager-65b7fbb54-twj8q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Mar 19 00:11:47 crc kubenswrapper[4745]: I0319 00:11:47.394736 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Mar 19 00:11:48 crc kubenswrapper[4745]: I0319 00:11:48.434512 4745 generic.go:334] "Generic (PLEG): container finished" podID="9c10585a-574b-4a55-8b88-9997418b9e02" containerID="58ce3dd279b59a910658de2cc809956b25c1c5a660cb3d2371dd1d7acd584281" exitCode=0 Mar 19 00:11:48 crc kubenswrapper[4745]: I0319 00:11:48.434593 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" event={"ID":"9c10585a-574b-4a55-8b88-9997418b9e02","Type":"ContainerDied","Data":"58ce3dd279b59a910658de2cc809956b25c1c5a660cb3d2371dd1d7acd584281"} Mar 19 00:11:48 crc kubenswrapper[4745]: I0319 
00:11:48.436049 4745 generic.go:334] "Generic (PLEG): container finished" podID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" containerID="cd7d8b9f0a89687fb705d0a182049cc385f2250f150a6c876b33a23f8286e636" exitCode=0 Mar 19 00:11:48 crc kubenswrapper[4745]: I0319 00:11:48.436073 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" event={"ID":"14ce067d-78a6-4ed3-9295-fb73f2b931fb","Type":"ContainerDied","Data":"cd7d8b9f0a89687fb705d0a182049cc385f2250f150a6c876b33a23f8286e636"} Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.794508 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.829688 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:11:51 crc kubenswrapper[4745]: E0319 00:11:51.830130 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5698031f-9dc1-4457-a866-2fd312ebfa9e" containerName="pruner" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830150 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="5698031f-9dc1-4457-a866-2fd312ebfa9e" containerName="pruner" Mar 19 00:11:51 crc kubenswrapper[4745]: E0319 00:11:51.830167 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" containerName="oc" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830174 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" containerName="oc" Mar 19 00:11:51 crc kubenswrapper[4745]: E0319 00:11:51.830185 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" containerName="route-controller-manager" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830192 
4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" containerName="route-controller-manager" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830320 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" containerName="oc" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830334 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="5698031f-9dc1-4457-a866-2fd312ebfa9e" containerName="pruner" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830346 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" containerName="route-controller-manager" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.830807 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.834379 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.874458 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7qbm\" (UniqueName: \"kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm\") pod \"9c10585a-574b-4a55-8b88-9997418b9e02\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.874561 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca\") pod \"9c10585a-574b-4a55-8b88-9997418b9e02\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.874701 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config\") pod \"9c10585a-574b-4a55-8b88-9997418b9e02\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.874726 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert\") pod \"9c10585a-574b-4a55-8b88-9997418b9e02\" (UID: \"9c10585a-574b-4a55-8b88-9997418b9e02\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.876910 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877045 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877246 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877279 4745 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config" (OuterVolumeSpecName: "config") pod "9c10585a-574b-4a55-8b88-9997418b9e02" (UID: "9c10585a-574b-4a55-8b88-9997418b9e02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877448 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxr8g\" (UniqueName: \"kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877617 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.877854 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca" (OuterVolumeSpecName: "client-ca") pod "9c10585a-574b-4a55-8b88-9997418b9e02" (UID: "9c10585a-574b-4a55-8b88-9997418b9e02"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.887144 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9c10585a-574b-4a55-8b88-9997418b9e02" (UID: "9c10585a-574b-4a55-8b88-9997418b9e02"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.887164 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm" (OuterVolumeSpecName: "kube-api-access-h7qbm") pod "9c10585a-574b-4a55-8b88-9997418b9e02" (UID: "9c10585a-574b-4a55-8b88-9997418b9e02"). InnerVolumeSpecName "kube-api-access-h7qbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.914759 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978423 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles\") pod \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978507 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca\") pod \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978530 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert\") pod \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978588 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config\") pod \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978623 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvgx5\" (UniqueName: \"kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5\") pod \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\" (UID: \"14ce067d-78a6-4ed3-9295-fb73f2b931fb\") " Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978757 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978779 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978827 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978861 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxr8g\" (UniqueName: 
\"kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978944 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7qbm\" (UniqueName: \"kubernetes.io/projected/9c10585a-574b-4a55-8b88-9997418b9e02-kube-api-access-h7qbm\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978959 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c10585a-574b-4a55-8b88-9997418b9e02-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.978970 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c10585a-574b-4a55-8b88-9997418b9e02-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.980119 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "14ce067d-78a6-4ed3-9295-fb73f2b931fb" (UID: "14ce067d-78a6-4ed3-9295-fb73f2b931fb"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.980464 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca" (OuterVolumeSpecName: "client-ca") pod "14ce067d-78a6-4ed3-9295-fb73f2b931fb" (UID: "14ce067d-78a6-4ed3-9295-fb73f2b931fb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.981946 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config" (OuterVolumeSpecName: "config") pod "14ce067d-78a6-4ed3-9295-fb73f2b931fb" (UID: "14ce067d-78a6-4ed3-9295-fb73f2b931fb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.982331 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.982661 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.984080 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "14ce067d-78a6-4ed3-9295-fb73f2b931fb" (UID: "14ce067d-78a6-4ed3-9295-fb73f2b931fb"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.984611 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5" (OuterVolumeSpecName: "kube-api-access-hvgx5") pod "14ce067d-78a6-4ed3-9295-fb73f2b931fb" (UID: "14ce067d-78a6-4ed3-9295-fb73f2b931fb"). InnerVolumeSpecName "kube-api-access-hvgx5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.987819 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:51 crc kubenswrapper[4745]: I0319 00:11:51.994046 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxr8g\" (UniqueName: \"kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g\") pod \"route-controller-manager-f5d9bb49-v5xtz\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.080862 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.080917 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvgx5\" (UniqueName: \"kubernetes.io/projected/14ce067d-78a6-4ed3-9295-fb73f2b931fb-kube-api-access-hvgx5\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.080929 4745 
reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.080938 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/14ce067d-78a6-4ed3-9295-fb73f2b931fb-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.080948 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14ce067d-78a6-4ed3-9295-fb73f2b931fb-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.206818 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.460484 4745 generic.go:334] "Generic (PLEG): container finished" podID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerID="6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4" exitCode=0 Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.460565 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerDied","Data":"6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4"} Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.462596 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.462582 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q" event={"ID":"9c10585a-574b-4a55-8b88-9997418b9e02","Type":"ContainerDied","Data":"77173caa51e84ae390dbeedd6df17d40cd949d3139f39b0e78d88f448392b2b6"} Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.462751 4745 scope.go:117] "RemoveContainer" containerID="58ce3dd279b59a910658de2cc809956b25c1c5a660cb3d2371dd1d7acd584281" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.466131 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.466129 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c4857fb46-nxt5t" event={"ID":"14ce067d-78a6-4ed3-9295-fb73f2b931fb","Type":"ContainerDied","Data":"4df8c028721f86f64f802110e34a6846149248c20cfe8d5077d6d03475ad3327"} Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.468441 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10"} Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.483636 4745 scope.go:117] "RemoveContainer" containerID="cd7d8b9f0a89687fb705d0a182049cc385f2250f150a6c876b33a23f8286e636" Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.491259 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.494825 4745 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-65b7fbb54-twj8q"] Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.516470 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.519642 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c4857fb46-nxt5t"] Mar 19 00:11:52 crc kubenswrapper[4745]: I0319 00:11:52.605735 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:11:52 crc kubenswrapper[4745]: W0319 00:11:52.612734 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb08b6e2_c86d_4188_94dd_5605fe96f0dc.slice/crio-7c70b71f5c22a1ac77ce6797c842aca3d59f64dd925d211bb1dfe8fe46dc9706 WatchSource:0}: Error finding container 7c70b71f5c22a1ac77ce6797c842aca3d59f64dd925d211bb1dfe8fe46dc9706: Status 404 returned error can't find the container with id 7c70b71f5c22a1ac77ce6797c842aca3d59f64dd925d211bb1dfe8fe46dc9706 Mar 19 00:11:53 crc kubenswrapper[4745]: I0319 00:11:53.480004 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" event={"ID":"fb08b6e2-c86d-4188-94dd-5605fe96f0dc","Type":"ContainerStarted","Data":"f071dc9deab1618a1875395460a8ed8c7772d6925654e51fb0ee08dc1afbdc0a"} Mar 19 00:11:53 crc kubenswrapper[4745]: I0319 00:11:53.480500 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" event={"ID":"fb08b6e2-c86d-4188-94dd-5605fe96f0dc","Type":"ContainerStarted","Data":"7c70b71f5c22a1ac77ce6797c842aca3d59f64dd925d211bb1dfe8fe46dc9706"} Mar 19 00:11:53 crc kubenswrapper[4745]: I0319 00:11:53.515756 
4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" podStartSLOduration=6.515721802 podStartE2EDuration="6.515721802s" podCreationTimestamp="2026-03-19 00:11:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:53.502910244 +0000 UTC m=+278.041105405" watchObservedRunningTime="2026-03-19 00:11:53.515721802 +0000 UTC m=+278.053916933" Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.160354 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" path="/var/lib/kubelet/pods/14ce067d-78a6-4ed3-9295-fb73f2b931fb/volumes" Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.162373 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c10585a-574b-4a55-8b88-9997418b9e02" path="/var/lib/kubelet/pods/9c10585a-574b-4a55-8b88-9997418b9e02/volumes" Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.489503 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerStarted","Data":"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.491508 4745 generic.go:334] "Generic (PLEG): container finished" podID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerID="abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3" exitCode=0 Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.491588 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerDied","Data":"abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 
00:11:54.494714 4745 generic.go:334] "Generic (PLEG): container finished" podID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerID="0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c" exitCode=0 Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.494759 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerDied","Data":"0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.502029 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerStarted","Data":"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.504332 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerStarted","Data":"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.507308 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerStarted","Data":"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9"} Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.509968 4745 generic.go:334] "Generic (PLEG): container finished" podID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerID="0f78ef6f7929f92b1cadbd575e901ee75c29f7405f9ec002e10bc0dc0774c85c" exitCode=0 Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.510040 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" 
event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerDied","Data":"0f78ef6f7929f92b1cadbd575e901ee75c29f7405f9ec002e10bc0dc0774c85c"}
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.512240 4745 generic.go:334] "Generic (PLEG): container finished" podID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerID="982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5" exitCode=0
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.512296 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerDied","Data":"982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5"}
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.512655 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.520823 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.534041 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"]
Mar 19 00:11:54 crc kubenswrapper[4745]: E0319 00:11:54.534340 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" containerName="controller-manager"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.534362 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" containerName="controller-manager"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.534504 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ce067d-78a6-4ed3-9295-fb73f2b931fb" containerName="controller-manager"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.535127 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.541454 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.542135 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.543165 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.596741 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.601551 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.601758 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.608153 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.614068 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"]
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.617527 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mtjq5" podStartSLOduration=3.56730852 podStartE2EDuration="1m3.617508361s" podCreationTimestamp="2026-03-19 00:10:51 +0000 UTC" firstStartedPulling="2026-03-19 00:10:53.667136009 +0000 UTC m=+218.205331140" lastFinishedPulling="2026-03-19 00:11:53.71733585 +0000 UTC m=+278.255530981" observedRunningTime="2026-03-19 00:11:54.561393249 +0000 UTC m=+279.099588380" watchObservedRunningTime="2026-03-19 00:11:54.617508361 +0000 UTC m=+279.155703492"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.619315 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.619394 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.619456 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5r2q\" (UniqueName: \"kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.619489 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.619574 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.631031 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q9zn6" podStartSLOduration=4.328831615 podStartE2EDuration="1m6.631010692s" podCreationTimestamp="2026-03-19 00:10:48 +0000 UTC" firstStartedPulling="2026-03-19 00:10:51.116172097 +0000 UTC m=+215.654367228" lastFinishedPulling="2026-03-19 00:11:53.418351174 +0000 UTC m=+277.956546305" observedRunningTime="2026-03-19 00:11:54.624479163 +0000 UTC m=+279.162674294" watchObservedRunningTime="2026-03-19 00:11:54.631010692 +0000 UTC m=+279.169205823"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.721138 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5r2q\" (UniqueName: \"kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.721225 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.721285 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.721324 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.721359 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.723262 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.723257 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.731728 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.754512 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5r2q\" (UniqueName: \"kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.839850 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config\") pod \"controller-manager-6c6948f6c9-85v5z\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:54 crc kubenswrapper[4745]: I0319 00:11:54.918663 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.197307 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"]
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.532385 4745 generic.go:334] "Generic (PLEG): container finished" podID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerID="568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9" exitCode=0
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.532904 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerDied","Data":"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9"}
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.540527 4745 generic.go:334] "Generic (PLEG): container finished" podID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerID="f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370" exitCode=0
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.540637 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerDied","Data":"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370"}
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.554222 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" event={"ID":"590c9ede-2a42-4251-9e05-321d560b674d","Type":"ContainerStarted","Data":"db8d8bd8447a0725df35d66bc2403d44409e33b72b4a06fb932c9b74861957f6"}
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.554308 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" event={"ID":"590c9ede-2a42-4251-9e05-321d560b674d","Type":"ContainerStarted","Data":"27758a5fbf789728ef48779744e310672915539b075e2b4fbe9182cb884b96c3"}
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.557444 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.560400 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"
Mar 19 00:11:55 crc kubenswrapper[4745]: I0319 00:11:55.594177 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" podStartSLOduration=9.594155933 podStartE2EDuration="9.594155933s" podCreationTimestamp="2026-03-19 00:11:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:11:55.591300872 +0000 UTC m=+280.129496013" watchObservedRunningTime="2026-03-19 00:11:55.594155933 +0000 UTC m=+280.132351064"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:56.560275 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerStarted","Data":"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"}
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:56.579457 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-g5dw2" podStartSLOduration=2.876740324 podStartE2EDuration="1m7.579437712s" podCreationTimestamp="2026-03-19 00:10:49 +0000 UTC" firstStartedPulling="2026-03-19 00:10:51.187609261 +0000 UTC m=+215.725804392" lastFinishedPulling="2026-03-19 00:11:55.890306649 +0000 UTC m=+280.428501780" observedRunningTime="2026-03-19 00:11:56.577989556 +0000 UTC m=+281.116184687" watchObservedRunningTime="2026-03-19 00:11:56.579437712 +0000 UTC m=+281.117632843"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:57.567869 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerStarted","Data":"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9"}
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:57.586759 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hhfzg" podStartSLOduration=3.218344471 podStartE2EDuration="1m8.586741223s" podCreationTimestamp="2026-03-19 00:10:49 +0000 UTC" firstStartedPulling="2026-03-19 00:10:51.148460931 +0000 UTC m=+215.686656052" lastFinishedPulling="2026-03-19 00:11:56.516857673 +0000 UTC m=+281.055052804" observedRunningTime="2026-03-19 00:11:57.58666226 +0000 UTC m=+282.124857401" watchObservedRunningTime="2026-03-19 00:11:57.586741223 +0000 UTC m=+282.124936354"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.367424 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q9zn6"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.367792 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q9zn6"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.559557 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hhfzg"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.559859 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hhfzg"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.579999 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerStarted","Data":"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba"}
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.928001 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:11:59 crc kubenswrapper[4745]: I0319 00:11:59.928375 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.133459 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564652-nhhsh"]
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.134142 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.135646 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.136436 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.140631 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.151050 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564652-nhhsh"]
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.287863 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znwm9\" (UniqueName: \"kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9\") pod \"auto-csr-approver-29564652-nhhsh\" (UID: \"9ef55829-c74d-4c78-b9b9-1c3ea05456e9\") " pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.339488 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hhfzg"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.340492 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q9zn6"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.342477 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.393059 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znwm9\" (UniqueName: \"kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9\") pod \"auto-csr-approver-29564652-nhhsh\" (UID: \"9ef55829-c74d-4c78-b9b9-1c3ea05456e9\") " pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.416144 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znwm9\" (UniqueName: \"kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9\") pod \"auto-csr-approver-29564652-nhhsh\" (UID: \"9ef55829-c74d-4c78-b9b9-1c3ea05456e9\") " pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.420860 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q9zn6"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.451698 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:00 crc kubenswrapper[4745]: I0319 00:12:00.604670 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wf9ss" podStartSLOduration=4.899620489 podStartE2EDuration="1m9.604645129s" podCreationTimestamp="2026-03-19 00:10:51 +0000 UTC" firstStartedPulling="2026-03-19 00:10:53.667420379 +0000 UTC m=+218.205615510" lastFinishedPulling="2026-03-19 00:11:58.372445019 +0000 UTC m=+282.910640150" observedRunningTime="2026-03-19 00:12:00.600781776 +0000 UTC m=+285.138976907" watchObservedRunningTime="2026-03-19 00:12:00.604645129 +0000 UTC m=+285.142840260"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.181334 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564652-nhhsh"]
Mar 19 00:12:01 crc kubenswrapper[4745]: W0319 00:12:01.184240 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ef55829_c74d_4c78_b9b9_1c3ea05456e9.slice/crio-ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13 WatchSource:0}: Error finding container ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13: Status 404 returned error can't find the container with id ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.533746 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mtjq5"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.533802 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mtjq5"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.587480 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mtjq5"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.595073 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerStarted","Data":"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a"}
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.596872 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" event={"ID":"9ef55829-c74d-4c78-b9b9-1c3ea05456e9","Type":"ContainerStarted","Data":"ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13"}
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.645073 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mtjq5"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.646016 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.933998 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.934394 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:12:01 crc kubenswrapper[4745]: I0319 00:12:01.972053 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wf9ss"
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.170872 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"]
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.605115 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerStarted","Data":"117f361e74f1b08965125d9bb35ef740e9e3e9deac5263171a3478a453bec6f6"}
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.607520 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerStarted","Data":"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c"}
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.624449 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kgsn7" podStartSLOduration=2.467508418 podStartE2EDuration="1m13.624415847s" podCreationTimestamp="2026-03-19 00:10:49 +0000 UTC" firstStartedPulling="2026-03-19 00:10:51.134133764 +0000 UTC m=+215.672328895" lastFinishedPulling="2026-03-19 00:12:02.291041183 +0000 UTC m=+286.829236324" observedRunningTime="2026-03-19 00:12:02.623480157 +0000 UTC m=+287.161675308" watchObservedRunningTime="2026-03-19 00:12:02.624415847 +0000 UTC m=+287.162611008"
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.644313 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cgghw" podStartSLOduration=3.283028802 podStartE2EDuration="1m10.644289402s" podCreationTimestamp="2026-03-19 00:10:52 +0000 UTC" firstStartedPulling="2026-03-19 00:10:54.881193501 +0000 UTC m=+219.419388632" lastFinishedPulling="2026-03-19 00:12:02.242454101 +0000 UTC m=+286.780649232" observedRunningTime="2026-03-19 00:12:02.642139542 +0000 UTC m=+287.180334683" watchObservedRunningTime="2026-03-19 00:12:02.644289402 +0000 UTC m=+287.182484523"
Mar 19 00:12:02 crc kubenswrapper[4745]: I0319 00:12:02.668022 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-75vmv" podStartSLOduration=3.569932053 podStartE2EDuration="1m10.667999678s" podCreationTimestamp="2026-03-19 00:10:52 +0000 UTC" firstStartedPulling="2026-03-19 00:10:53.683011427 +0000 UTC m=+218.221206558" lastFinishedPulling="2026-03-19 00:12:00.781079052 +0000 UTC m=+285.319274183" observedRunningTime="2026-03-19 00:12:02.665735566 +0000 UTC m=+287.203930697" watchObservedRunningTime="2026-03-19 00:12:02.667999678 +0000 UTC m=+287.206194809"
Mar 19 00:12:03 crc kubenswrapper[4745]: I0319 00:12:03.059781 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:12:03 crc kubenswrapper[4745]: I0319 00:12:03.059837 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cgghw"
Mar 19 00:12:03 crc kubenswrapper[4745]: I0319 00:12:03.613093 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" event={"ID":"9ef55829-c74d-4c78-b9b9-1c3ea05456e9","Type":"ContainerStarted","Data":"5516b8a0a4bc7aafa493bd87254867dd7254eae5e71faee49575516dfd155284"}
Mar 19 00:12:03 crc kubenswrapper[4745]: I0319 00:12:03.613758 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-g5dw2" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="registry-server" containerID="cri-o://2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b" gracePeriod=2
Mar 19 00:12:03 crc kubenswrapper[4745]: I0319 00:12:03.627730 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" podStartSLOduration=1.5222021749999999 podStartE2EDuration="3.62771247s" podCreationTimestamp="2026-03-19 00:12:00 +0000 UTC" firstStartedPulling="2026-03-19 00:12:01.187685735 +0000 UTC m=+285.725880866" lastFinishedPulling="2026-03-19 00:12:03.29319603 +0000 UTC m=+287.831391161" observedRunningTime="2026-03-19 00:12:03.62521038 +0000 UTC m=+288.163405501" watchObservedRunningTime="2026-03-19 00:12:03.62771247 +0000 UTC m=+288.165907601"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.078345 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.099155 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cgghw" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="registry-server" probeResult="failure" output=<
Mar 19 00:12:04 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s
Mar 19 00:12:04 crc kubenswrapper[4745]: >
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.147328 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities\") pod \"ee0bf814-e571-41fe-9265-b77d8b53e20f\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") "
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.147515 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content\") pod \"ee0bf814-e571-41fe-9265-b77d8b53e20f\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") "
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.147555 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-695vg\" (UniqueName: \"kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg\") pod \"ee0bf814-e571-41fe-9265-b77d8b53e20f\" (UID: \"ee0bf814-e571-41fe-9265-b77d8b53e20f\") "
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.148405 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities" (OuterVolumeSpecName: "utilities") pod "ee0bf814-e571-41fe-9265-b77d8b53e20f" (UID: "ee0bf814-e571-41fe-9265-b77d8b53e20f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.155591 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg" (OuterVolumeSpecName: "kube-api-access-695vg") pod "ee0bf814-e571-41fe-9265-b77d8b53e20f" (UID: "ee0bf814-e571-41fe-9265-b77d8b53e20f"). InnerVolumeSpecName "kube-api-access-695vg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.210654 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ee0bf814-e571-41fe-9265-b77d8b53e20f" (UID: "ee0bf814-e571-41fe-9265-b77d8b53e20f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.249231 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.249291 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-695vg\" (UniqueName: \"kubernetes.io/projected/ee0bf814-e571-41fe-9265-b77d8b53e20f-kube-api-access-695vg\") on node \"crc\" DevicePath \"\""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.249307 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ee0bf814-e571-41fe-9265-b77d8b53e20f-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.622251 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ef55829-c74d-4c78-b9b9-1c3ea05456e9" containerID="5516b8a0a4bc7aafa493bd87254867dd7254eae5e71faee49575516dfd155284" exitCode=0
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.622335 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" event={"ID":"9ef55829-c74d-4c78-b9b9-1c3ea05456e9","Type":"ContainerDied","Data":"5516b8a0a4bc7aafa493bd87254867dd7254eae5e71faee49575516dfd155284"}
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.625149 4745 generic.go:334] "Generic (PLEG): container finished" podID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerID="2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b" exitCode=0
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.625213 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-g5dw2"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.625228 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerDied","Data":"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"}
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.625278 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-g5dw2" event={"ID":"ee0bf814-e571-41fe-9265-b77d8b53e20f","Type":"ContainerDied","Data":"5cf5d6c8b2c76c4c80fde1c17a6532692631c387bb1a224bdbe1f73591bc68b3"}
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.625300 4745 scope.go:117] "RemoveContainer" containerID="2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.646993 4745 scope.go:117] "RemoveContainer" containerID="0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.655052 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"]
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.669553 4745 scope.go:117] "RemoveContainer" containerID="737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.672584 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-g5dw2"]
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.690080 4745 scope.go:117] "RemoveContainer" containerID="2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"
Mar 19 00:12:04 crc kubenswrapper[4745]: E0319 00:12:04.690553 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b\": container with ID starting with 2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b not found: ID does not exist" containerID="2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.690609 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b"} err="failed to get container status \"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b\": rpc error: code = NotFound desc = could not find container \"2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b\": container with ID starting with 2d5a090ff2bf27634b32766a51b6f7807f73e4e78bbdf698598abc243d364c4b not found: ID does not exist"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.690641 4745 scope.go:117] "RemoveContainer" containerID="0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c"
Mar 19 00:12:04 crc kubenswrapper[4745]: E0319 00:12:04.691188 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c\": container with ID starting with 0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c not found: ID does not exist" containerID="0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.691222 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c"} err="failed to get container status \"0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c\": rpc error: code = NotFound desc = could not find container \"0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c\": container with ID starting with 0d13aeb0c34af8c27e7a7f4e1b780392128dec6bbc6f07954ca9e6142b44d86c not found: ID does not exist"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.691268 4745 scope.go:117] "RemoveContainer" containerID="737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925"
Mar 19 00:12:04 crc kubenswrapper[4745]: E0319 00:12:04.691595 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925\": container with ID starting with 737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925 not found: ID does not exist" containerID="737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925"
Mar 19 00:12:04 crc kubenswrapper[4745]: I0319 00:12:04.691613 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925"} err="failed to get container status \"737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925\": rpc error: code = NotFound desc = could not find container \"737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925\": container with ID starting with 737c906b9c9a23aaf350bbe9470a53d0ea401bbb7d294ef2643bbff2c8125925 not found: ID does not exist"
Mar 19 00:12:05 crc kubenswrapper[4745]: I0319 00:12:05.981404 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564652-nhhsh"
Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.146262 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" path="/var/lib/kubelet/pods/ee0bf814-e571-41fe-9265-b77d8b53e20f/volumes"
Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.172655 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znwm9\" (UniqueName: \"kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9\") pod \"9ef55829-c74d-4c78-b9b9-1c3ea05456e9\" (UID: \"9ef55829-c74d-4c78-b9b9-1c3ea05456e9\") "
Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.177103 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9" (OuterVolumeSpecName: "kube-api-access-znwm9") pod "9ef55829-c74d-4c78-b9b9-1c3ea05456e9" (UID: "9ef55829-c74d-4c78-b9b9-1c3ea05456e9"). InnerVolumeSpecName "kube-api-access-znwm9".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.274376 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znwm9\" (UniqueName: \"kubernetes.io/projected/9ef55829-c74d-4c78-b9b9-1c3ea05456e9-kube-api-access-znwm9\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.639252 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" event={"ID":"9ef55829-c74d-4c78-b9b9-1c3ea05456e9","Type":"ContainerDied","Data":"ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13"} Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.639606 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee55d738618d399b2b12d1980b8f1fa4c6866fe88fffd45efbd52a50ad123e13" Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.639302 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564652-nhhsh" Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.951565 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"] Mar 19 00:12:06 crc kubenswrapper[4745]: I0319 00:12:06.951761 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" podUID="590c9ede-2a42-4251-9e05-321d560b674d" containerName="controller-manager" containerID="cri-o://db8d8bd8447a0725df35d66bc2403d44409e33b72b4a06fb932c9b74861957f6" gracePeriod=30 Mar 19 00:12:07 crc kubenswrapper[4745]: I0319 00:12:07.044427 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:12:07 crc kubenswrapper[4745]: I0319 00:12:07.044626 4745 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" podUID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" containerName="route-controller-manager" containerID="cri-o://f071dc9deab1618a1875395460a8ed8c7772d6925654e51fb0ee08dc1afbdc0a" gracePeriod=30 Mar 19 00:12:07 crc kubenswrapper[4745]: I0319 00:12:07.647937 4745 generic.go:334] "Generic (PLEG): container finished" podID="590c9ede-2a42-4251-9e05-321d560b674d" containerID="db8d8bd8447a0725df35d66bc2403d44409e33b72b4a06fb932c9b74861957f6" exitCode=0 Mar 19 00:12:07 crc kubenswrapper[4745]: I0319 00:12:07.647988 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" event={"ID":"590c9ede-2a42-4251-9e05-321d560b674d","Type":"ContainerDied","Data":"db8d8bd8447a0725df35d66bc2403d44409e33b72b4a06fb932c9b74861957f6"} Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.655570 4745 generic.go:334] "Generic (PLEG): container finished" podID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" containerID="f071dc9deab1618a1875395460a8ed8c7772d6925654e51fb0ee08dc1afbdc0a" exitCode=0 Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.655706 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" event={"ID":"fb08b6e2-c86d-4188-94dd-5605fe96f0dc","Type":"ContainerDied","Data":"f071dc9deab1618a1875395460a8ed8c7772d6925654e51fb0ee08dc1afbdc0a"} Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.703470 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.708546 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.729866 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f964689fb-zqm4l"] Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730127 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="registry-server" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730148 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="registry-server" Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730167 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="590c9ede-2a42-4251-9e05-321d560b674d" containerName="controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730174 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="590c9ede-2a42-4251-9e05-321d560b674d" containerName="controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730186 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" containerName="route-controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730195 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" containerName="route-controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730205 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="extract-content" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730213 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="extract-content" Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730225 4745 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="extract-utilities" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730232 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="extract-utilities" Mar 19 00:12:08 crc kubenswrapper[4745]: E0319 00:12:08.730242 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef55829-c74d-4c78-b9b9-1c3ea05456e9" containerName="oc" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730249 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef55829-c74d-4c78-b9b9-1c3ea05456e9" containerName="oc" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730726 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef55829-c74d-4c78-b9b9-1c3ea05456e9" containerName="oc" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730751 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" containerName="route-controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730763 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="590c9ede-2a42-4251-9e05-321d560b674d" containerName="controller-manager" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.730798 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee0bf814-e571-41fe-9265-b77d8b53e20f" containerName="registry-server" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.731647 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.744350 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f964689fb-zqm4l"] Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803614 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca\") pod \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803652 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert\") pod \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803705 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config\") pod \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\" (UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803744 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5r2q\" (UniqueName: \"kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q\") pod \"590c9ede-2a42-4251-9e05-321d560b674d\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803783 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxr8g\" (UniqueName: \"kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g\") pod \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\" 
(UID: \"fb08b6e2-c86d-4188-94dd-5605fe96f0dc\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803804 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca\") pod \"590c9ede-2a42-4251-9e05-321d560b674d\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803818 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert\") pod \"590c9ede-2a42-4251-9e05-321d560b674d\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803848 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config\") pod \"590c9ede-2a42-4251-9e05-321d560b674d\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.803905 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles\") pod \"590c9ede-2a42-4251-9e05-321d560b674d\" (UID: \"590c9ede-2a42-4251-9e05-321d560b674d\") " Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.804681 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca" (OuterVolumeSpecName: "client-ca") pod "fb08b6e2-c86d-4188-94dd-5605fe96f0dc" (UID: "fb08b6e2-c86d-4188-94dd-5605fe96f0dc"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.804749 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "590c9ede-2a42-4251-9e05-321d560b674d" (UID: "590c9ede-2a42-4251-9e05-321d560b674d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.804765 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca" (OuterVolumeSpecName: "client-ca") pod "590c9ede-2a42-4251-9e05-321d560b674d" (UID: "590c9ede-2a42-4251-9e05-321d560b674d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.804778 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config" (OuterVolumeSpecName: "config") pod "fb08b6e2-c86d-4188-94dd-5605fe96f0dc" (UID: "fb08b6e2-c86d-4188-94dd-5605fe96f0dc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.804838 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config" (OuterVolumeSpecName: "config") pod "590c9ede-2a42-4251-9e05-321d560b674d" (UID: "590c9ede-2a42-4251-9e05-321d560b674d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.809414 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g" (OuterVolumeSpecName: "kube-api-access-mxr8g") pod "fb08b6e2-c86d-4188-94dd-5605fe96f0dc" (UID: "fb08b6e2-c86d-4188-94dd-5605fe96f0dc"). InnerVolumeSpecName "kube-api-access-mxr8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.809419 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q" (OuterVolumeSpecName: "kube-api-access-f5r2q") pod "590c9ede-2a42-4251-9e05-321d560b674d" (UID: "590c9ede-2a42-4251-9e05-321d560b674d"). InnerVolumeSpecName "kube-api-access-f5r2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.809917 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "590c9ede-2a42-4251-9e05-321d560b674d" (UID: "590c9ede-2a42-4251-9e05-321d560b674d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.810057 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fb08b6e2-c86d-4188-94dd-5605fe96f0dc" (UID: "fb08b6e2-c86d-4188-94dd-5605fe96f0dc"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904722 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-proxy-ca-bundles\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904803 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-config\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904824 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b68939-d8b8-40ed-b961-67c46d82099e-serving-cert\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904845 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn6pr\" (UniqueName: \"kubernetes.io/projected/57b68939-d8b8-40ed-b961-67c46d82099e-kube-api-access-rn6pr\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904862 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-client-ca\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904953 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/590c9ede-2a42-4251-9e05-321d560b674d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904972 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904980 4745 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904990 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.904999 4745 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.905008 4745 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.905018 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5r2q\" (UniqueName: 
\"kubernetes.io/projected/590c9ede-2a42-4251-9e05-321d560b674d-kube-api-access-f5r2q\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.905027 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxr8g\" (UniqueName: \"kubernetes.io/projected/fb08b6e2-c86d-4188-94dd-5605fe96f0dc-kube-api-access-mxr8g\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:08 crc kubenswrapper[4745]: I0319 00:12:08.905035 4745 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/590c9ede-2a42-4251-9e05-321d560b674d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.006400 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-proxy-ca-bundles\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.006476 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-config\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.006498 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b68939-d8b8-40ed-b961-67c46d82099e-serving-cert\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.006515 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-rn6pr\" (UniqueName: \"kubernetes.io/projected/57b68939-d8b8-40ed-b961-67c46d82099e-kube-api-access-rn6pr\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.006533 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-client-ca\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.007438 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-client-ca\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.007770 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-proxy-ca-bundles\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.009020 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57b68939-d8b8-40ed-b961-67c46d82099e-config\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc 
kubenswrapper[4745]: I0319 00:12:09.009776 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57b68939-d8b8-40ed-b961-67c46d82099e-serving-cert\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.026469 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn6pr\" (UniqueName: \"kubernetes.io/projected/57b68939-d8b8-40ed-b961-67c46d82099e-kube-api-access-rn6pr\") pod \"controller-manager-f964689fb-zqm4l\" (UID: \"57b68939-d8b8-40ed-b961-67c46d82099e\") " pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.057009 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.253404 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f964689fb-zqm4l"] Mar 19 00:12:09 crc kubenswrapper[4745]: W0319 00:12:09.260867 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b68939_d8b8_40ed_b961_67c46d82099e.slice/crio-a03574ee4b866ae4effd82a9566cb03016697e02fbf29e652fa4ee4783ffbb83 WatchSource:0}: Error finding container a03574ee4b866ae4effd82a9566cb03016697e02fbf29e652fa4ee4783ffbb83: Status 404 returned error can't find the container with id a03574ee4b866ae4effd82a9566cb03016697e02fbf29e652fa4ee4783ffbb83 Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.598413 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 
00:12:09.661968 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" event={"ID":"57b68939-d8b8-40ed-b961-67c46d82099e","Type":"ContainerStarted","Data":"a03574ee4b866ae4effd82a9566cb03016697e02fbf29e652fa4ee4783ffbb83"} Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.663590 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" event={"ID":"590c9ede-2a42-4251-9e05-321d560b674d","Type":"ContainerDied","Data":"27758a5fbf789728ef48779744e310672915539b075e2b4fbe9182cb884b96c3"} Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.663617 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c6948f6c9-85v5z" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.663648 4745 scope.go:117] "RemoveContainer" containerID="db8d8bd8447a0725df35d66bc2403d44409e33b72b4a06fb932c9b74861957f6" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.664829 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" event={"ID":"fb08b6e2-c86d-4188-94dd-5605fe96f0dc","Type":"ContainerDied","Data":"7c70b71f5c22a1ac77ce6797c842aca3d59f64dd925d211bb1dfe8fe46dc9706"} Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.664901 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.694671 4745 scope.go:117] "RemoveContainer" containerID="f071dc9deab1618a1875395460a8ed8c7772d6925654e51fb0ee08dc1afbdc0a" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.701580 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"] Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.704653 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c6948f6c9-85v5z"] Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.712335 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.716478 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-f5d9bb49-v5xtz"] Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.737513 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.737833 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:09 crc kubenswrapper[4745]: I0319 00:12:09.773405 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.145535 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="590c9ede-2a42-4251-9e05-321d560b674d" path="/var/lib/kubelet/pods/590c9ede-2a42-4251-9e05-321d560b674d/volumes" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.160234 4745 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="fb08b6e2-c86d-4188-94dd-5605fe96f0dc" path="/var/lib/kubelet/pods/fb08b6e2-c86d-4188-94dd-5605fe96f0dc/volumes" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.480395 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-522nc"] Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.672001 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" event={"ID":"57b68939-d8b8-40ed-b961-67c46d82099e","Type":"ContainerStarted","Data":"83aa938fad8f85173c008b4f63aed73a3eba5a81d0bd0763a971110d2227f0a7"} Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.672201 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.679301 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.695544 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f964689fb-zqm4l" podStartSLOduration=4.695520233 podStartE2EDuration="4.695520233s" podCreationTimestamp="2026-03-19 00:12:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:12:10.691611408 +0000 UTC m=+295.229806559" watchObservedRunningTime="2026-03-19 00:12:10.695520233 +0000 UTC m=+295.233715374" Mar 19 00:12:10 crc kubenswrapper[4745]: I0319 00:12:10.722078 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.543128 4745 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q"] Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.544454 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.546855 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.546927 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.547040 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.547150 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.547150 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.549068 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.554801 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q"] Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.638092 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rnth\" (UniqueName: \"kubernetes.io/projected/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-kube-api-access-9rnth\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: 
\"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.638156 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-client-ca\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.638222 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-serving-cert\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.638248 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-config\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.739516 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-config\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.739638 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-9rnth\" (UniqueName: \"kubernetes.io/projected/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-kube-api-access-9rnth\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.739666 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-client-ca\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.739686 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-serving-cert\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.741074 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-client-ca\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.741482 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-config\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " 
pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.748540 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-serving-cert\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.755404 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rnth\" (UniqueName: \"kubernetes.io/projected/7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d-kube-api-access-9rnth\") pod \"route-controller-manager-687d544d6d-7tl6q\" (UID: \"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d\") " pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:11 crc kubenswrapper[4745]: I0319 00:12:11.864589 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.042274 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.380259 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q"] Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.585008 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.585374 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.620591 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.687000 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" event={"ID":"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d","Type":"ContainerStarted","Data":"6d53d3280e85a953679ea0148898d1c1ea4fc002985162e0b274152b731b26d9"} Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.722361 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.772552 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kgsn7"] Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.897832 4745 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 00:12:12 crc kubenswrapper[4745]: 
I0319 00:12:12.898519 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.898745 4745 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.899182 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6" gracePeriod=15 Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.899227 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12" gracePeriod=15 Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.899238 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5" gracePeriod=15 Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.899261 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77" gracePeriod=15 Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.899266 4745 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210" gracePeriod=15 Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900044 4745 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900264 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900276 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900283 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900289 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900298 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900306 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900315 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900320 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900327 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900333 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900342 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900347 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900359 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900365 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900378 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900385 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900478 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900487 4745 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900496 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900503 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900511 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900520 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900530 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900536 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900646 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900654 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900777 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: E0319 00:12:12.900861 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.900868 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 00:12:12 crc kubenswrapper[4745]: I0319 00:12:12.941421 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.058717 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.058863 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.058982 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.059033 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.059149 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.059174 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.059204 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.059280 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: 
I0319 00:12:13.099110 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.099831 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.100172 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.100571 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.143702 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.144199 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.144553 4745 
status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.145014 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160727 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160776 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160803 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160818 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160842 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160858 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160888 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160894 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160914 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160964 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160993 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.161013 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.161034 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.161053 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.161077 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.160948 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: E0319 00:12:13.161626 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{route-controller-manager-687d544d6d-7tl6q.189e15ac568c56f4 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-687d544d6d-7tl6q,UID:7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d,APIVersion:v1,ResourceVersion:30018,FieldPath:spec.containers{route-controller-manager},},Reason:Created,Message:Created container route-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC m=+297.699345327,LastTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC 
m=+297.699345327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.240746 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.457582 4745 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.457860 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.692172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"f29949657637ad0e4307613a120b4e3c081a14fbf9cd618e3408f9c26c82c283"} Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.693696 4745 generic.go:334] "Generic (PLEG): container finished" podID="cb131d37-4be2-4843-9ed6-21fc0636b07f" containerID="e4a052c3c9127a082f3e369e3635dfa6ffb8b29174c2b0096d5b731709aa71d5" exitCode=0 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.693739 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb131d37-4be2-4843-9ed6-21fc0636b07f","Type":"ContainerDied","Data":"e4a052c3c9127a082f3e369e3635dfa6ffb8b29174c2b0096d5b731709aa71d5"} Mar 19 
00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.694223 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.694405 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.694556 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.694830 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.696233 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.697508 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.698412 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12" exitCode=0 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.698433 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5" exitCode=0 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.698442 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210" exitCode=0 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.698450 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77" exitCode=2 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.698527 4745 scope.go:117] "RemoveContainer" containerID="a8b22b7d0a2835d920fd46441ff5e0d890826cf5c51282fe13f6c93b2a7d7837" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.702385 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kgsn7" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="registry-server" containerID="cri-o://117f361e74f1b08965125d9bb35ef740e9e3e9deac5263171a3478a453bec6f6" gracePeriod=2 Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.702743 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" 
event={"ID":"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d","Type":"ContainerStarted","Data":"ac5267107b617a38d4d92ce01c7ea3aff52496bb39844e8c8ebed6d6e23fd177"} Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.702764 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.702792 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.703168 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.703485 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.703667 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: 
connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.703845 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.705648 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.706480 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.706769 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.707007 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 
38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.707183 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:13 crc kubenswrapper[4745]: I0319 00:12:13.707388 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.703437 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.703859 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.711051 4745 generic.go:334] "Generic (PLEG): container finished" podID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerID="117f361e74f1b08965125d9bb35ef740e9e3e9deac5263171a3478a453bec6f6" 
exitCode=0 Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.711118 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerDied","Data":"117f361e74f1b08965125d9bb35ef740e9e3e9deac5263171a3478a453bec6f6"} Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.712466 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e"} Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.713308 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.713643 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.713979 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.714278 4745 
status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.714729 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.715605 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.787368 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.788009 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.788577 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.789202 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.789730 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.790014 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.885106 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content\") pod \"09d29a41-94df-42b0-b7d3-6b47b06a238f\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.885178 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities\") pod \"09d29a41-94df-42b0-b7d3-6b47b06a238f\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.885224 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rkp49\" (UniqueName: \"kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49\") pod \"09d29a41-94df-42b0-b7d3-6b47b06a238f\" (UID: \"09d29a41-94df-42b0-b7d3-6b47b06a238f\") " Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.886222 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities" (OuterVolumeSpecName: "utilities") pod "09d29a41-94df-42b0-b7d3-6b47b06a238f" (UID: "09d29a41-94df-42b0-b7d3-6b47b06a238f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.891337 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49" (OuterVolumeSpecName: "kube-api-access-rkp49") pod "09d29a41-94df-42b0-b7d3-6b47b06a238f" (UID: "09d29a41-94df-42b0-b7d3-6b47b06a238f"). InnerVolumeSpecName "kube-api-access-rkp49". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.936366 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "09d29a41-94df-42b0-b7d3-6b47b06a238f" (UID: "09d29a41-94df-42b0-b7d3-6b47b06a238f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.987108 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.987429 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/09d29a41-94df-42b0-b7d3-6b47b06a238f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:14 crc kubenswrapper[4745]: I0319 00:12:14.987444 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rkp49\" (UniqueName: \"kubernetes.io/projected/09d29a41-94df-42b0-b7d3-6b47b06a238f-kube-api-access-rkp49\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.089346 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.089786 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.090181 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.090553 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.090755 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.090988 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.188728 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access\") pod \"cb131d37-4be2-4843-9ed6-21fc0636b07f\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.188822 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir\") pod \"cb131d37-4be2-4843-9ed6-21fc0636b07f\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.188871 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock\") pod \"cb131d37-4be2-4843-9ed6-21fc0636b07f\" (UID: \"cb131d37-4be2-4843-9ed6-21fc0636b07f\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.188975 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cb131d37-4be2-4843-9ed6-21fc0636b07f" (UID: "cb131d37-4be2-4843-9ed6-21fc0636b07f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.189051 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock" (OuterVolumeSpecName: "var-lock") pod "cb131d37-4be2-4843-9ed6-21fc0636b07f" (UID: "cb131d37-4be2-4843-9ed6-21fc0636b07f"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.189148 4745 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.189167 4745 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cb131d37-4be2-4843-9ed6-21fc0636b07f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.192534 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cb131d37-4be2-4843-9ed6-21fc0636b07f" (UID: "cb131d37-4be2-4843-9ed6-21fc0636b07f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.290545 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cb131d37-4be2-4843-9ed6-21fc0636b07f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.717168 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.717562 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" 
containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.724973 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.725759 4745 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6" exitCode=0 Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.728397 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kgsn7" event={"ID":"09d29a41-94df-42b0-b7d3-6b47b06a238f","Type":"ContainerDied","Data":"9abf27759067afc6e0e47fd16a9553d00f8c09a95f77720200901bcf6c2854bb"} Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.728504 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kgsn7" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.728504 4745 scope.go:117] "RemoveContainer" containerID="117f361e74f1b08965125d9bb35ef740e9e3e9deac5263171a3478a453bec6f6" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.729331 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.729608 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.729846 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.730195 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.730570 4745 status_manager.go:851] 
"Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.730749 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"cb131d37-4be2-4843-9ed6-21fc0636b07f","Type":"ContainerDied","Data":"0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7"} Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.730803 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.730807 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0687eea174b3e10ed9c2288a4a55fbe08d67199218b910f91a8f0259a91524d7" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.735617 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.736607 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.737214 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.737447 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.737677 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.737974 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.738375 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.738802 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.743820 4745 scope.go:117] "RemoveContainer" containerID="0f78ef6f7929f92b1cadbd575e901ee75c29f7405f9ec002e10bc0dc0774c85c" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.745818 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.746317 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.746697 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.747093 4745 status_manager.go:851] 
"Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.747376 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.747609 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.747966 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.748243 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc 
kubenswrapper[4745]: I0319 00:12:15.748521 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.748814 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.749215 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.749472 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.759052 4745 scope.go:117] "RemoveContainer" containerID="3c0dd7e0c251e39bd78fdfc535f458fe29dccacfeda18a0fdd0fe102becb3d5f" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897101 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897153 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897209 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897533 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897571 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.897592 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). 
InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.998607 4745 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.998636 4745 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:15 crc kubenswrapper[4745]: I0319 00:12:15.998644 4745 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.141227 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.141485 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.141637 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.141778 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.141965 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.142108 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.147142 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.742312 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.743161 4745 scope.go:117] "RemoveContainer" 
containerID="9b79eba40c8ce458b8450d8e1c1788cfb24db3a40d1afdb5dbf71da8f915dd12" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.743341 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.744290 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.744608 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.745189 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.745943 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.746209 4745 
status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.746577 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.747057 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.747495 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.747813 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 
crc kubenswrapper[4745]: I0319 00:12:16.748274 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.748560 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.748820 4745 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.769223 4745 scope.go:117] "RemoveContainer" containerID="6e7287ef2abc7adce5bd14f698b62c984af5550fb2cdb4e8f75c27b4ffb895d5" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.786783 4745 scope.go:117] "RemoveContainer" containerID="dbe19c4a065115099f3da5a7be377b79e10cf298864c3dc53774918b3cb0e210" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.800218 4745 scope.go:117] "RemoveContainer" containerID="1a689b156ff68b385a128c0b55f032223ef97fc9ca0d0e7e55500bf78fc0fb77" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.815784 4745 scope.go:117] "RemoveContainer" containerID="b2ccff57bcc9e8d93a9eb1b2771fda9dbad12fced88b8bc57bbc512111d0aad6" Mar 19 00:12:16 crc kubenswrapper[4745]: I0319 00:12:16.837688 4745 scope.go:117] "RemoveContainer" 
containerID="a398e5c5a761a9fa65c88f0395e0209e532fa56f1f75a4f94e877df205da09a1" Mar 19 00:12:17 crc kubenswrapper[4745]: E0319 00:12:17.159428 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{route-controller-manager-687d544d6d-7tl6q.189e15ac568c56f4 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-687d544d6d-7tl6q,UID:7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d,APIVersion:v1,ResourceVersion:30018,FieldPath:spec.containers{route-controller-manager},},Reason:Created,Message:Created container route-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC m=+297.699345327,LastTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC m=+297.699345327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.488936 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.489725 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.490294 4745 controller.go:195] "Failed to update lease" err="Put 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.490688 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.490993 4745 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:19 crc kubenswrapper[4745]: I0319 00:12:19.491026 4745 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.491379 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="200ms" Mar 19 00:12:19 crc kubenswrapper[4745]: E0319 00:12:19.692874 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="400ms" Mar 19 00:12:20 crc kubenswrapper[4745]: E0319 00:12:20.093754 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="800ms" Mar 19 00:12:20 crc kubenswrapper[4745]: 
E0319 00:12:20.894931 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="1.6s" Mar 19 00:12:22 crc kubenswrapper[4745]: E0319 00:12:22.175187 4745 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" volumeName="registry-storage" Mar 19 00:12:22 crc kubenswrapper[4745]: E0319 00:12:22.496279 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="3.2s" Mar 19 00:12:22 crc kubenswrapper[4745]: I0319 00:12:22.865948 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:12:22 crc kubenswrapper[4745]: I0319 00:12:22.866010 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:25 crc kubenswrapper[4745]: E0319 00:12:25.698031 4745 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.158:6443: connect: connection refused" interval="6.4s" Mar 19 00:12:26 crc kubenswrapper[4745]: I0319 00:12:26.141676 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:26 crc kubenswrapper[4745]: I0319 00:12:26.142430 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:26 crc kubenswrapper[4745]: I0319 00:12:26.142844 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:26 crc kubenswrapper[4745]: I0319 00:12:26.143290 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 
38.102.83.158:6443: connect: connection refused" Mar 19 00:12:26 crc kubenswrapper[4745]: I0319 00:12:26.144008 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.137086 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.137926 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.138427 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.138702 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.139382 4745 status_manager.go:851] "Failed to get status for pod" 
podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.139719 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.158146 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.158504 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:27 crc kubenswrapper[4745]: E0319 00:12:27.159068 4745 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.159637 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:27 crc kubenswrapper[4745]: E0319 00:12:27.160041 4745 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/events\": dial tcp 38.102.83.158:6443: connect: connection refused" event="&Event{ObjectMeta:{route-controller-manager-687d544d6d-7tl6q.189e15ac568c56f4 openshift-route-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-route-controller-manager,Name:route-controller-manager-687d544d6d-7tl6q,UID:7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d,APIVersion:v1,ResourceVersion:30018,FieldPath:spec.containers{route-controller-manager},},Reason:Created,Message:Created container route-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC m=+297.699345327,LastTimestamp:2026-03-19 00:12:13.161150196 +0000 UTC m=+297.699345327,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 00:12:27 crc kubenswrapper[4745]: W0319 00:12:27.196369 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-906998f27c7e09e4a2e06afeeea4c6a74a6a887a1235b39719ed19e7be002146 WatchSource:0}: Error finding container 906998f27c7e09e4a2e06afeeea4c6a74a6a887a1235b39719ed19e7be002146: Status 404 returned error can't find the container with id 906998f27c7e09e4a2e06afeeea4c6a74a6a887a1235b39719ed19e7be002146 Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.808067 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.809074 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.809328 4745 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb" exitCode=1 Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.809398 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb"} Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.810872 4745 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811125 4745 scope.go:117] "RemoveContainer" containerID="0813dbc0721d720a11edc789ec3ffe0f14fec3274030a66127a3bd959c79a6fb" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811265 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc 
kubenswrapper[4745]: I0319 00:12:27.811143 4745 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="ee7c6817aea68de19f15cb78d4e294a5507dc112393300176d86df7160f985ea" exitCode=0 Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811170 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"ee7c6817aea68de19f15cb78d4e294a5507dc112393300176d86df7160f985ea"} Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811475 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"906998f27c7e09e4a2e06afeeea4c6a74a6a887a1235b39719ed19e7be002146"} Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811812 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811836 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.811919 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: E0319 00:12:27.812274 4745 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.158:6443: 
connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.812409 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.813033 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.813284 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.813669 4745 status_manager.go:851] "Failed to get status for pod" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" pod="openshift-marketplace/redhat-operators-cgghw" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-cgghw\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.814127 4745 status_manager.go:851] "Failed to get status for pod" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-687d544d6d-7tl6q\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.814577 4745 status_manager.go:851] "Failed to get status for pod" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.815059 4745 status_manager.go:851] "Failed to get status for pod" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" pod="openshift-marketplace/community-operators-kgsn7" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-kgsn7\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.815614 4745 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.816123 4745 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.158:6443: connect: connection refused" Mar 19 00:12:27 crc kubenswrapper[4745]: I0319 00:12:27.981400 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.825514 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"02f97eec88725c668a3b4eae9c7207237aeb66449f48a325a7e6568b50e7da18"} Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.826067 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4be914544e9953ce510ed2ad681f3ae31cb11387a94fcb7e4e94a4fac099aca7"} Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.826084 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ea904d0e0bf3b380fa7e494ce26ef4a72587aed13e33ee323cc4a8808bcd4d8f"} Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.826096 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"c439e8c39490851b522803b1283e21d698b19152abfe504e39a3d05254dae0ee"} Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.834790 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.835646 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 00:12:28 crc kubenswrapper[4745]: I0319 00:12:28.835777 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e170bd623e237b7d4480aa146ecbfaf91651cdb0a850dac4e09af99860e21ab4"} Mar 19 00:12:29 crc kubenswrapper[4745]: I0319 00:12:29.849400 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4ab527b0f206fadb762890f83bf5fd29549aefcca2298b8947f66a628f495ec7"} Mar 19 00:12:29 crc kubenswrapper[4745]: I0319 00:12:29.849768 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:29 crc kubenswrapper[4745]: I0319 00:12:29.850521 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:29 crc kubenswrapper[4745]: I0319 00:12:29.850595 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.160202 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.160530 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.165149 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.526702 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.865586 4745 patch_prober.go:28] interesting 
pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 00:12:32 crc kubenswrapper[4745]: I0319 00:12:32.865645 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 00:12:34 crc kubenswrapper[4745]: I0319 00:12:34.859574 4745 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.527377 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" containerID="cri-o://5e97a389fffe604d654de61eb43dad886ba3f0e357ed64056f25bb2e0ae3b360" gracePeriod=15 Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.880605 4745 generic.go:334] "Generic (PLEG): container finished" podID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerID="5e97a389fffe604d654de61eb43dad886ba3f0e357ed64056f25bb2e0ae3b360" exitCode=0 Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.880688 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" event={"ID":"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4","Type":"ContainerDied","Data":"5e97a389fffe604d654de61eb43dad886ba3f0e357ed64056f25bb2e0ae3b360"} Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 
00:12:35.881277 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.881294 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.884929 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 00:12:35 crc kubenswrapper[4745]: I0319 00:12:35.960034 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037245 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037296 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037326 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037351 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037387 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-88wjz\" (UniqueName: \"kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037428 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037454 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037476 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037496 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037522 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037548 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037574 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037596 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.037618 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig\") pod \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\" (UID: \"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4\") " Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.038602 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.038658 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.039107 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.039742 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). 
InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.040267 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.044947 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.045056 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz" (OuterVolumeSpecName: "kube-api-access-88wjz") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "kube-api-access-88wjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.045143 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.045375 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.045495 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.045611 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.046034 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.046043 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.046256 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" (UID: "d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138246 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138278 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138290 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88wjz\" (UniqueName: \"kubernetes.io/projected/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-kube-api-access-88wjz\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138300 
4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138310 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138319 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138328 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138337 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138346 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138354 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138363 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138372 4745 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138382 4745 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.138391 4745 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.171383 4745 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4e655125-59af-476c-8508-2b9550782d73" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.888343 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" event={"ID":"d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4","Type":"ContainerDied","Data":"6ed318766e28a7953b973ede1884a1baaf2b2983a95f1b27491abc323fc1f4ae"} Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.888401 4745 scope.go:117] "RemoveContainer" 
containerID="5e97a389fffe604d654de61eb43dad886ba3f0e357ed64056f25bb2e0ae3b360" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.888411 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-522nc" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.888561 4745 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.888586 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="bd7a8e96-6428-45d1-90bf-0e26563710a7" Mar 19 00:12:36 crc kubenswrapper[4745]: I0319 00:12:36.895198 4745 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="4e655125-59af-476c-8508-2b9550782d73" Mar 19 00:12:37 crc kubenswrapper[4745]: I0319 00:12:37.981564 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:12:37 crc kubenswrapper[4745]: I0319 00:12:37.986010 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:12:38 crc kubenswrapper[4745]: I0319 00:12:38.902942 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 00:12:42 crc kubenswrapper[4745]: I0319 00:12:42.866226 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers)" start-of-body=
Mar 19 00:12:42 crc kubenswrapper[4745]: I0319 00:12:42.866623 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 00:12:44 crc kubenswrapper[4745]: I0319 00:12:44.587388 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 00:12:44 crc kubenswrapper[4745]: I0319 00:12:44.937630 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-687d544d6d-7tl6q_7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d/route-controller-manager/0.log"
Mar 19 00:12:44 crc kubenswrapper[4745]: I0319 00:12:44.937691 4745 generic.go:334] "Generic (PLEG): container finished" podID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerID="ac5267107b617a38d4d92ce01c7ea3aff52496bb39844e8c8ebed6d6e23fd177" exitCode=255
Mar 19 00:12:44 crc kubenswrapper[4745]: I0319 00:12:44.937727 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" event={"ID":"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d","Type":"ContainerDied","Data":"ac5267107b617a38d4d92ce01c7ea3aff52496bb39844e8c8ebed6d6e23fd177"}
Mar 19 00:12:44 crc kubenswrapper[4745]: I0319 00:12:44.938259 4745 scope.go:117] "RemoveContainer" containerID="ac5267107b617a38d4d92ce01c7ea3aff52496bb39844e8c8ebed6d6e23fd177"
Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.085174 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.556083 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.618469 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.730975 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.805525 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.944610 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-687d544d6d-7tl6q_7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d/route-controller-manager/0.log"
Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.944894 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" event={"ID":"7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d","Type":"ContainerStarted","Data":"5ab9c0a2bf10341d7966c87c620df68f7517e654098d7f145806f21deb455dbf"}
Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.945289 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q"
Mar 19 00:12:45 crc kubenswrapper[4745]: I0319 00:12:45.969147 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.104905 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.171903 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.209493 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.243464 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.435010 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.513563 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.828425 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.945731 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 00:12:46 crc kubenswrapper[4745]: I0319 00:12:46.945826 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.228530 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.373758 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.399457 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.441807 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.513495 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.534348 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.549323 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.674666 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.735515 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.825037 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.830585 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.884397 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.920078 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.950280 4745 patch_prober.go:28] interesting pod/route-controller-manager-687d544d6d-7tl6q container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.950585 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podUID="7ec6c7ac-820c-4cac-be05-ada6fb1a7d1d" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.983784 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 19 00:12:47 crc kubenswrapper[4745]: I0319 00:12:47.997996 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.059855 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.103085 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.127015 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.170260 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.213862 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.269032 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.270590 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.286562 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.391617 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.586012 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.587261 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.601663 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.689488 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.722315 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.760088 4745 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.760519 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.760501035 podStartE2EDuration="36.760501035s" podCreationTimestamp="2026-03-19 00:12:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:12:34.544287788 +0000 UTC m=+319.082482939" watchObservedRunningTime="2026-03-19 00:12:48.760501035 +0000 UTC m=+333.298696166"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.762396 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q" podStartSLOduration=41.762385585 podStartE2EDuration="41.762385585s" podCreationTimestamp="2026-03-19 00:12:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:12:34.598988985 +0000 UTC m=+319.137184136" watchObservedRunningTime="2026-03-19 00:12:48.762385585 +0000 UTC m=+333.300580716"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.763991 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.764854 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-522nc","openshift-marketplace/community-operators-kgsn7"]
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.764979 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.768619 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.780828 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=14.780810087999999 podStartE2EDuration="14.780810088s" podCreationTimestamp="2026-03-19 00:12:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:12:48.780421336 +0000 UTC m=+333.318616467" watchObservedRunningTime="2026-03-19 00:12:48.780810088 +0000 UTC m=+333.319005219"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.802247 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.808244 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.834587 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 00:12:48 crc kubenswrapper[4745]: I0319 00:12:48.947795 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.025643 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.037091 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.125169 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.227592 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.449818 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.576171 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.622920 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.622956 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.663090 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.804725 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.865376 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.882989 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.908664 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.921058 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 19 00:12:49 crc kubenswrapper[4745]: I0319 00:12:49.937583 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.001323 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.045632 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.045903 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.125825 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.144620 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" path="/var/lib/kubelet/pods/09d29a41-94df-42b0-b7d3-6b47b06a238f/volumes"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.145325 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" path="/var/lib/kubelet/pods/d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4/volumes"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.156405 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.212371 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.227174 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.250125 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.349119 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.364286 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.370828 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.499381 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.567604 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.667001 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.698153 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.854064 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.908842 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 00:12:50 crc kubenswrapper[4745]: I0319 00:12:50.961453 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.034511 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.063268 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.087602 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.173262 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.192323 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.194153 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.277281 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.327379 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.397128 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.458134 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.486759 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.492406 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.497822 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.555873 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.642791 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.660702 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.707995 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.775737 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.825391 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.895112 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.895170 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.949032 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.951208 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.964254 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 00:12:51 crc kubenswrapper[4745]: I0319 00:12:51.993347 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.096600 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.143197 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.149685 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.160361 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.164019 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.201711 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.310736 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.329740 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.333596 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.494997 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.503383 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.555261 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.640538 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.647574 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.690315 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.745442 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.751875 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.842678 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-687d544d6d-7tl6q"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.842846 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.867793 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.902495 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 19 00:12:52 crc kubenswrapper[4745]: I0319 00:12:52.927297 4745 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.038237 4745 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.177732 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.197742 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.243418 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.284576 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.329776 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.467556 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.518953 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.596131 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.614683 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.623970 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.656874 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.687474 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.833921 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 19 00:12:53 crc kubenswrapper[4745]: I0319 00:12:53.936936 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.071406 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.174802 4745 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.188754 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.233221 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.259953 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.521600 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.541327 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.541463 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.707696 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.769793 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.802027 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 00:12:54 crc kubenswrapper[4745]: I0319 00:12:54.893939 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.037763 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.136007 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.163133 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.241579 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.269639 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.290466 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.291301 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.297634 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.362973 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.391239 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.406950 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.436722 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.459516 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.463915 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.537773 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.537874 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.550819 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.767843 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 00:12:55 crc kubenswrapper[4745]: I0319 00:12:55.980815 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.073681 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.078592 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.099092 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.306098 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.390102 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.418002 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.686644 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.758717 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.768346 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.801393
4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.836998 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.915512 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.967757 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 00:12:56 crc kubenswrapper[4745]: I0319 00:12:56.973963 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.114625 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.235192 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.288476 4745 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.289014 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e" gracePeriod=5 Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.309592 4745 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.332256 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.377155 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.407003 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.437341 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.488962 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.545684 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.553856 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.580283 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.677912 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.716979 4745 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.808335 4745 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.818144 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.826664 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 00:12:57 crc kubenswrapper[4745]: I0319 00:12:57.917535 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.020495 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.087858 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.149283 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.150065 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.166319 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.282838 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.304933 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 
19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.335137 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.372590 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.415317 4745 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.468666 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.470120 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593481 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5fc96db957-9rmlw"] Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593695 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593706 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593714 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="extract-content" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593722 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="extract-content" Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593731 4745 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="registry-server" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593737 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="registry-server" Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593746 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593753 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593767 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="extract-utilities" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593774 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="extract-utilities" Mar 19 00:12:58 crc kubenswrapper[4745]: E0319 00:12:58.593785 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" containerName="installer" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593791 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" containerName="installer" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593870 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="09d29a41-94df-42b0-b7d3-6b47b06a238f" containerName="registry-server" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593897 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d08496f4-3cf5-4cb6-8d31-81eb6a7f8ce4" containerName="oauth-openshift" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593912 4745 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.593926 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb131d37-4be2-4843-9ed6-21fc0636b07f" containerName="installer" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.594276 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.597588 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.597612 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.597696 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.598571 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.599668 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.602440 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.602569 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.602675 4745 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.602732 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.603484 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.604570 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.605115 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.605133 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.611672 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.616070 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.618129 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.619917 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fc96db957-9rmlw"] Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.699804 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.699857 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.699997 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700022 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-error\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700056 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-login\") pod \"oauth-openshift-5fc96db957-9rmlw\" 
(UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700129 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-policies\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700172 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-dir\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700210 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700275 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700317 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700350 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700369 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700384 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb2zt\" (UniqueName: \"kubernetes.io/projected/eda29b61-f024-422b-8558-d7d5c4ef1bfa-kube-api-access-wb2zt\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.700407 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-session\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.715631 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801731 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-dir\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801794 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801831 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801854 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801894 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801916 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801933 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb2zt\" (UniqueName: \"kubernetes.io/projected/eda29b61-f024-422b-8558-d7d5c4ef1bfa-kube-api-access-wb2zt\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801948 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-session\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " 
pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801973 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.801992 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802023 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802043 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-error\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802062 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-login\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802079 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-policies\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802828 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-policies\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.802903 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/eda29b61-f024-422b-8558-d7d5c4ef1bfa-audit-dir\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.803310 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-service-ca\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 
00:12:58.803712 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.804275 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.805782 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.810143 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-error\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.810224 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.811011 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-router-certs\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.815726 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.819698 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.820733 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-system-session\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.821098 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb2zt\" (UniqueName: \"kubernetes.io/projected/eda29b61-f024-422b-8558-d7d5c4ef1bfa-kube-api-access-wb2zt\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: 
\"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.823058 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.827626 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/eda29b61-f024-422b-8558-d7d5c4ef1bfa-v4-0-config-user-template-login\") pod \"oauth-openshift-5fc96db957-9rmlw\" (UID: \"eda29b61-f024-422b-8558-d7d5c4ef1bfa\") " pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.841448 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 00:12:58 crc kubenswrapper[4745]: I0319 00:12:58.911698 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.079393 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.195676 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.196371 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.335189 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.356094 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5fc96db957-9rmlw"] Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.386692 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.444354 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.471400 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.505981 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.506554 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.566553 4745 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 00:12:59 crc kubenswrapper[4745]: I0319 00:12:59.970922 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.023825 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" event={"ID":"eda29b61-f024-422b-8558-d7d5c4ef1bfa","Type":"ContainerStarted","Data":"fa8e408cd9744dc60a1a7adbf7b77f161a7b290d421d84041b64bd7e4d2f3d5f"} Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.024191 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.024287 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" event={"ID":"eda29b61-f024-422b-8558-d7d5c4ef1bfa","Type":"ContainerStarted","Data":"c550440c425072db1c1668f1c3846d3b673ba2c62b460a530f56cd7b2897d765"} Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.030585 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.047906 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5fc96db957-9rmlw" podStartSLOduration=50.047866728 podStartE2EDuration="50.047866728s" podCreationTimestamp="2026-03-19 00:12:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:13:00.043709697 +0000 UTC m=+344.581904858" watchObservedRunningTime="2026-03-19 00:13:00.047866728 +0000 UTC m=+344.586061869" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 
00:13:00.452921 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.603499 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 00:13:00 crc kubenswrapper[4745]: I0319 00:13:00.611352 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 00:13:01 crc kubenswrapper[4745]: I0319 00:13:01.422772 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.876560 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.877006 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958172 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958561 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958684 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958805 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958957 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958704 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: 
"resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958758 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.958856 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.959041 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.959447 4745 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.959528 4745 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.959639 4745 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.959722 4745 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:02 crc kubenswrapper[4745]: I0319 00:13:02.964396 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.043988 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.044056 4745 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e" exitCode=137 Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.044108 4745 scope.go:117] "RemoveContainer" containerID="01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.044134 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.060770 4745 scope.go:117] "RemoveContainer" containerID="01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e" Mar 19 00:13:03 crc kubenswrapper[4745]: E0319 00:13:03.061194 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e\": container with ID starting with 01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e not found: ID does not exist" containerID="01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.061234 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e"} err="failed to get container status \"01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e\": rpc error: code = NotFound desc = could not find container 
\"01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e\": container with ID starting with 01c404c5882787d1618b8dce53cc7e9042fa4d8c33e753bc865c77f1043e3d8e not found: ID does not exist" Mar 19 00:13:03 crc kubenswrapper[4745]: I0319 00:13:03.061343 4745 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.144662 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.145180 4745 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.154125 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.154171 4745 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c30049d6-78af-46f5-b93d-7a74c53cbc3a" Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.157766 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 00:13:04 crc kubenswrapper[4745]: I0319 00:13:04.157813 4745 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="c30049d6-78af-46f5-b93d-7a74c53cbc3a" Mar 19 00:13:07 crc kubenswrapper[4745]: I0319 00:13:07.645920 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 00:13:18 crc kubenswrapper[4745]: I0319 
00:13:18.130726 4745 generic.go:334] "Generic (PLEG): container finished" podID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerID="afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8" exitCode=0 Mar 19 00:13:18 crc kubenswrapper[4745]: I0319 00:13:18.130786 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerDied","Data":"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8"} Mar 19 00:13:18 crc kubenswrapper[4745]: I0319 00:13:18.131635 4745 scope.go:117] "RemoveContainer" containerID="afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8" Mar 19 00:13:19 crc kubenswrapper[4745]: I0319 00:13:19.139935 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerStarted","Data":"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243"} Mar 19 00:13:19 crc kubenswrapper[4745]: I0319 00:13:19.141208 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:13:19 crc kubenswrapper[4745]: I0319 00:13:19.143468 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.300011 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.300820 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wf9ss" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="registry-server" containerID="cri-o://71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba" gracePeriod=2 Mar 
19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.806858 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.975712 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content\") pod \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.975770 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities\") pod \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.975822 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sq28\" (UniqueName: \"kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28\") pod \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\" (UID: \"04cc89b5-7bac-4b91-bb97-a1f5ab14260c\") " Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.978700 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities" (OuterVolumeSpecName: "utilities") pod "04cc89b5-7bac-4b91-bb97-a1f5ab14260c" (UID: "04cc89b5-7bac-4b91-bb97-a1f5ab14260c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:13:42 crc kubenswrapper[4745]: I0319 00:13:42.982938 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28" (OuterVolumeSpecName: "kube-api-access-8sq28") pod "04cc89b5-7bac-4b91-bb97-a1f5ab14260c" (UID: "04cc89b5-7bac-4b91-bb97-a1f5ab14260c"). InnerVolumeSpecName "kube-api-access-8sq28". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.004759 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04cc89b5-7bac-4b91-bb97-a1f5ab14260c" (UID: "04cc89b5-7bac-4b91-bb97-a1f5ab14260c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.077127 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.077159 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.077169 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sq28\" (UniqueName: \"kubernetes.io/projected/04cc89b5-7bac-4b91-bb97-a1f5ab14260c-kube-api-access-8sq28\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.265266 4745 generic.go:334] "Generic (PLEG): container finished" podID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" 
containerID="71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba" exitCode=0 Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.265358 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wf9ss" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.265368 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerDied","Data":"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba"} Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.265743 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wf9ss" event={"ID":"04cc89b5-7bac-4b91-bb97-a1f5ab14260c","Type":"ContainerDied","Data":"9d959c62c69d3a59b0e990d8806f87c5f90630c2ae5bf2656f164c9a4cd2a4ac"} Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.265767 4745 scope.go:117] "RemoveContainer" containerID="71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.291765 4745 scope.go:117] "RemoveContainer" containerID="982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.304562 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.311794 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wf9ss"] Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.314968 4745 scope.go:117] "RemoveContainer" containerID="6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.343392 4745 scope.go:117] "RemoveContainer" containerID="71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba" Mar 19 
00:13:43 crc kubenswrapper[4745]: E0319 00:13:43.343812 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba\": container with ID starting with 71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba not found: ID does not exist" containerID="71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.343858 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba"} err="failed to get container status \"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba\": rpc error: code = NotFound desc = could not find container \"71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba\": container with ID starting with 71db88ad7b9349168e4c71f9e0159cd1e69bb91ca9d61f122727e3b83d5c2bba not found: ID does not exist" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.343905 4745 scope.go:117] "RemoveContainer" containerID="982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5" Mar 19 00:13:43 crc kubenswrapper[4745]: E0319 00:13:43.344307 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5\": container with ID starting with 982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5 not found: ID does not exist" containerID="982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.344334 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5"} err="failed to get container status 
\"982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5\": rpc error: code = NotFound desc = could not find container \"982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5\": container with ID starting with 982199d74b1470a7d5860e355ba73c9af2ec700cdf96753f2cae3cd7683749e5 not found: ID does not exist" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.344353 4745 scope.go:117] "RemoveContainer" containerID="6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa" Mar 19 00:13:43 crc kubenswrapper[4745]: E0319 00:13:43.344530 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa\": container with ID starting with 6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa not found: ID does not exist" containerID="6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa" Mar 19 00:13:43 crc kubenswrapper[4745]: I0319 00:13:43.344551 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa"} err="failed to get container status \"6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa\": rpc error: code = NotFound desc = could not find container \"6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa\": container with ID starting with 6b624c1c091d2590157470e77a902b5dca52979b2dbc0b679df926949f6becaa not found: ID does not exist" Mar 19 00:13:44 crc kubenswrapper[4745]: I0319 00:13:44.144482 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" path="/var/lib/kubelet/pods/04cc89b5-7bac-4b91-bb97-a1f5ab14260c/volumes" Mar 19 00:13:55 crc kubenswrapper[4745]: I0319 00:13:55.499378 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"] Mar 19 00:13:55 
crc kubenswrapper[4745]: I0319 00:13:55.500225 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cgghw" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="registry-server" containerID="cri-o://5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c" gracePeriod=2 Mar 19 00:13:55 crc kubenswrapper[4745]: I0319 00:13:55.870375 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.030938 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities\") pod \"b19d4fad-672f-40f3-bfdb-53b36da06399\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.031432 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content\") pod \"b19d4fad-672f-40f3-bfdb-53b36da06399\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.031469 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdzp9\" (UniqueName: \"kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9\") pod \"b19d4fad-672f-40f3-bfdb-53b36da06399\" (UID: \"b19d4fad-672f-40f3-bfdb-53b36da06399\") " Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.031972 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities" (OuterVolumeSpecName: "utilities") pod "b19d4fad-672f-40f3-bfdb-53b36da06399" (UID: "b19d4fad-672f-40f3-bfdb-53b36da06399"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.038393 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9" (OuterVolumeSpecName: "kube-api-access-mdzp9") pod "b19d4fad-672f-40f3-bfdb-53b36da06399" (UID: "b19d4fad-672f-40f3-bfdb-53b36da06399"). InnerVolumeSpecName "kube-api-access-mdzp9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.133426 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.133466 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdzp9\" (UniqueName: \"kubernetes.io/projected/b19d4fad-672f-40f3-bfdb-53b36da06399-kube-api-access-mdzp9\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.167375 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b19d4fad-672f-40f3-bfdb-53b36da06399" (UID: "b19d4fad-672f-40f3-bfdb-53b36da06399"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.234981 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b19d4fad-672f-40f3-bfdb-53b36da06399-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.330393 4745 generic.go:334] "Generic (PLEG): container finished" podID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerID="5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c" exitCode=0 Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.330441 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerDied","Data":"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c"} Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.330455 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cgghw" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.330477 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cgghw" event={"ID":"b19d4fad-672f-40f3-bfdb-53b36da06399","Type":"ContainerDied","Data":"a6f79e4dda71ca031bc419e937553d73dcdca6b9aad0060adce374c4237e4880"} Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.330499 4745 scope.go:117] "RemoveContainer" containerID="5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.345802 4745 scope.go:117] "RemoveContainer" containerID="f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.363022 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"] Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.367782 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cgghw"] Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.389608 4745 scope.go:117] "RemoveContainer" containerID="eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.406223 4745 scope.go:117] "RemoveContainer" containerID="5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c" Mar 19 00:13:56 crc kubenswrapper[4745]: E0319 00:13:56.407095 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c\": container with ID starting with 5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c not found: ID does not exist" containerID="5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.407141 4745 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c"} err="failed to get container status \"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c\": rpc error: code = NotFound desc = could not find container \"5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c\": container with ID starting with 5596f5b5a16cbf59f4c8d105d7756ffb9cad6414ed54e71bdffde47dfbc9017c not found: ID does not exist" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.407169 4745 scope.go:117] "RemoveContainer" containerID="f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370" Mar 19 00:13:56 crc kubenswrapper[4745]: E0319 00:13:56.407460 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370\": container with ID starting with f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370 not found: ID does not exist" containerID="f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.407500 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370"} err="failed to get container status \"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370\": rpc error: code = NotFound desc = could not find container \"f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370\": container with ID starting with f4ff930a3fc62366a39421d1adbd43f52a0bfdbd0baadf056178d4965e2df370 not found: ID does not exist" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.407527 4745 scope.go:117] "RemoveContainer" containerID="eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6" Mar 19 00:13:56 crc kubenswrapper[4745]: E0319 
00:13:56.407796 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6\": container with ID starting with eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6 not found: ID does not exist" containerID="eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6" Mar 19 00:13:56 crc kubenswrapper[4745]: I0319 00:13:56.407820 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6"} err="failed to get container status \"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6\": rpc error: code = NotFound desc = could not find container \"eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6\": container with ID starting with eece5e102bb19bd860690d686e6a121b60f66fee3f467c6cf321b96f6bc230c6 not found: ID does not exist" Mar 19 00:13:58 crc kubenswrapper[4745]: I0319 00:13:58.144537 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" path="/var/lib/kubelet/pods/b19d4fad-672f-40f3-bfdb-53b36da06399/volumes" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.164516 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564654-j2b7b"] Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165097 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="extract-utilities" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165110 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="extract-utilities" Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165123 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="extract-utilities" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165129 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="extract-utilities" Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165142 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="extract-content" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165153 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="extract-content" Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165159 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165165 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165174 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165179 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: E0319 00:14:00.165192 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="extract-content" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165199 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="extract-content" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165291 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b19d4fad-672f-40f3-bfdb-53b36da06399" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165303 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="04cc89b5-7bac-4b91-bb97-a1f5ab14260c" containerName="registry-server" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.165642 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.168471 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.168980 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.169343 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.170738 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564654-j2b7b"] Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.287202 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8tm7\" (UniqueName: \"kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7\") pod \"auto-csr-approver-29564654-j2b7b\" (UID: \"f7689b2b-3fcb-4122-bb50-fb8215cdb08b\") " pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.388669 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8tm7\" (UniqueName: \"kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7\") pod \"auto-csr-approver-29564654-j2b7b\" (UID: \"f7689b2b-3fcb-4122-bb50-fb8215cdb08b\") " 
pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.410668 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8tm7\" (UniqueName: \"kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7\") pod \"auto-csr-approver-29564654-j2b7b\" (UID: \"f7689b2b-3fcb-4122-bb50-fb8215cdb08b\") " pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.481827 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:00 crc kubenswrapper[4745]: I0319 00:14:00.898430 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564654-j2b7b"] Mar 19 00:14:01 crc kubenswrapper[4745]: I0319 00:14:01.355871 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" event={"ID":"f7689b2b-3fcb-4122-bb50-fb8215cdb08b","Type":"ContainerStarted","Data":"4a1aa249e4c789f82a17ac8341e4377f3f91069620b3cc7aa8bf4600e94dcc7f"} Mar 19 00:14:03 crc kubenswrapper[4745]: I0319 00:14:03.373547 4745 generic.go:334] "Generic (PLEG): container finished" podID="f7689b2b-3fcb-4122-bb50-fb8215cdb08b" containerID="d0b7bf2fb29c7b89c86195effcef47a72d6e88d2457a53a6804ca521616f6ee6" exitCode=0 Mar 19 00:14:03 crc kubenswrapper[4745]: I0319 00:14:03.373685 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" event={"ID":"f7689b2b-3fcb-4122-bb50-fb8215cdb08b","Type":"ContainerDied","Data":"d0b7bf2fb29c7b89c86195effcef47a72d6e88d2457a53a6804ca521616f6ee6"} Mar 19 00:14:04 crc kubenswrapper[4745]: I0319 00:14:04.635049 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:04 crc kubenswrapper[4745]: I0319 00:14:04.734812 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8tm7\" (UniqueName: \"kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7\") pod \"f7689b2b-3fcb-4122-bb50-fb8215cdb08b\" (UID: \"f7689b2b-3fcb-4122-bb50-fb8215cdb08b\") " Mar 19 00:14:04 crc kubenswrapper[4745]: I0319 00:14:04.741748 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7" (OuterVolumeSpecName: "kube-api-access-t8tm7") pod "f7689b2b-3fcb-4122-bb50-fb8215cdb08b" (UID: "f7689b2b-3fcb-4122-bb50-fb8215cdb08b"). InnerVolumeSpecName "kube-api-access-t8tm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:04 crc kubenswrapper[4745]: I0319 00:14:04.836203 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8tm7\" (UniqueName: \"kubernetes.io/projected/f7689b2b-3fcb-4122-bb50-fb8215cdb08b-kube-api-access-t8tm7\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:05 crc kubenswrapper[4745]: I0319 00:14:05.384670 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" event={"ID":"f7689b2b-3fcb-4122-bb50-fb8215cdb08b","Type":"ContainerDied","Data":"4a1aa249e4c789f82a17ac8341e4377f3f91069620b3cc7aa8bf4600e94dcc7f"} Mar 19 00:14:05 crc kubenswrapper[4745]: I0319 00:14:05.384716 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a1aa249e4c789f82a17ac8341e4377f3f91069620b3cc7aa8bf4600e94dcc7f" Mar 19 00:14:05 crc kubenswrapper[4745]: I0319 00:14:05.384730 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564654-j2b7b" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.660721 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wrzf6"] Mar 19 00:14:11 crc kubenswrapper[4745]: E0319 00:14:11.662838 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7689b2b-3fcb-4122-bb50-fb8215cdb08b" containerName="oc" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.663053 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7689b2b-3fcb-4122-bb50-fb8215cdb08b" containerName="oc" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.663223 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7689b2b-3fcb-4122-bb50-fb8215cdb08b" containerName="oc" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.663941 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.678482 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wrzf6"] Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.819784 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-tls\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.820343 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/855c57ce-9fe1-4725-87e6-4b777c1fd55f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: 
\"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.820501 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-bound-sa-token\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.820664 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-trusted-ca\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.820774 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/855c57ce-9fe1-4725-87e6-4b777c1fd55f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.820922 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.821080 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-certificates\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.821195 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtjzj\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-kube-api-access-gtjzj\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.844160 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922621 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-trusted-ca\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922710 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/855c57ce-9fe1-4725-87e6-4b777c1fd55f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: 
\"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922741 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-certificates\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922765 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtjzj\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-kube-api-access-gtjzj\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922804 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/855c57ce-9fe1-4725-87e6-4b777c1fd55f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922823 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-tls\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.922854 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-bound-sa-token\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.923436 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/855c57ce-9fe1-4725-87e6-4b777c1fd55f-ca-trust-extracted\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.924101 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-certificates\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.924187 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/855c57ce-9fe1-4725-87e6-4b777c1fd55f-trusted-ca\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.928859 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/855c57ce-9fe1-4725-87e6-4b777c1fd55f-installation-pull-secrets\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.929405 4745 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-registry-tls\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.941774 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-bound-sa-token\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.941852 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtjzj\" (UniqueName: \"kubernetes.io/projected/855c57ce-9fe1-4725-87e6-4b777c1fd55f-kube-api-access-gtjzj\") pod \"image-registry-66df7c8f76-wrzf6\" (UID: \"855c57ce-9fe1-4725-87e6-4b777c1fd55f\") " pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:11 crc kubenswrapper[4745]: I0319 00:14:11.983457 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:12 crc kubenswrapper[4745]: I0319 00:14:12.181850 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-wrzf6"] Mar 19 00:14:12 crc kubenswrapper[4745]: I0319 00:14:12.424690 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" event={"ID":"855c57ce-9fe1-4725-87e6-4b777c1fd55f","Type":"ContainerStarted","Data":"1738f7f83a17a1cb84f1a84e0a5256950186d622e0c9ce3ed4afb6f9f740b175"} Mar 19 00:14:12 crc kubenswrapper[4745]: I0319 00:14:12.425059 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:12 crc kubenswrapper[4745]: I0319 00:14:12.425071 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" event={"ID":"855c57ce-9fe1-4725-87e6-4b777c1fd55f","Type":"ContainerStarted","Data":"7ade38938661cc1417844e00b94825e429fabf80b6db6f36ee9c2ef82efca515"} Mar 19 00:14:12 crc kubenswrapper[4745]: I0319 00:14:12.445050 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" podStartSLOduration=1.445029202 podStartE2EDuration="1.445029202s" podCreationTimestamp="2026-03-19 00:14:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:14:12.442652207 +0000 UTC m=+416.980847358" watchObservedRunningTime="2026-03-19 00:14:12.445029202 +0000 UTC m=+416.983224333" Mar 19 00:14:15 crc kubenswrapper[4745]: I0319 00:14:15.606450 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:14:15 crc kubenswrapper[4745]: I0319 00:14:15.606742 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.622512 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.623506 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hhfzg" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="registry-server" containerID="cri-o://0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9" gracePeriod=30 Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.632182 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q9zn6"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.632538 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q9zn6" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="registry-server" containerID="cri-o://1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f" gracePeriod=30 Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.637541 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.637977 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" 
podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator" containerID="cri-o://1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243" gracePeriod=30 Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.651106 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hcgn8"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.652398 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.654792 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.661386 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.661717 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-75vmv" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="registry-server" containerID="cri-o://e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a" gracePeriod=30 Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.662275 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mtjq5" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="registry-server" containerID="cri-o://97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42" gracePeriod=30 Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.671914 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hcgn8"] Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.789326 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.789390 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lkcr\" (UniqueName: \"kubernetes.io/projected/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-kube-api-access-9lkcr\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.789414 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.890861 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.890928 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lkcr\" (UniqueName: \"kubernetes.io/projected/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-kube-api-access-9lkcr\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.890945 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.892584 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.898270 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.922741 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lkcr\" (UniqueName: \"kubernetes.io/projected/9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06-kube-api-access-9lkcr\") pod \"marketplace-operator-79b997595-hcgn8\" (UID: \"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06\") " pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:23 crc kubenswrapper[4745]: I0319 00:14:23.979481 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.127527 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.131219 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.133603 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.144917 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.145664 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.294921 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca\") pod \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.294978 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics\") pod \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295042 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdqmk\" (UniqueName: \"kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk\") pod \"c21b8175-025a-4d91-ad43-389dbad40846\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295060 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content\") pod \"c21b8175-025a-4d91-ad43-389dbad40846\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295090 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content\") pod \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295107 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content\") pod \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295127 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities\") pod \"2c3c406d-9994-4629-b585-4d145b1e04aa\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295141 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities\") pod \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295158 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rttcj\" (UniqueName: \"kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj\") pod \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\" (UID: \"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295194 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content\") pod \"2c3c406d-9994-4629-b585-4d145b1e04aa\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295231 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82wwh\" (UniqueName: \"kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh\") pod \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\" (UID: 
\"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295248 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhwtp\" (UniqueName: \"kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp\") pod \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\" (UID: \"e2cfb22a-6632-4e35-8145-6e9815e6e76f\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295268 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities\") pod \"c21b8175-025a-4d91-ad43-389dbad40846\" (UID: \"c21b8175-025a-4d91-ad43-389dbad40846\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295287 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqj2f\" (UniqueName: \"kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f\") pod \"2c3c406d-9994-4629-b585-4d145b1e04aa\" (UID: \"2c3c406d-9994-4629-b585-4d145b1e04aa\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.295303 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities\") pod \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\" (UID: \"0c1d22d3-b584-4622-856c-b531a5d1ad5d\") " Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.296739 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities" (OuterVolumeSpecName: "utilities") pod "0c1d22d3-b584-4622-856c-b531a5d1ad5d" (UID: "0c1d22d3-b584-4622-856c-b531a5d1ad5d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.297684 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities" (OuterVolumeSpecName: "utilities") pod "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" (UID: "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.298220 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "e2cfb22a-6632-4e35-8145-6e9815e6e76f" (UID: "e2cfb22a-6632-4e35-8145-6e9815e6e76f"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.298723 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities" (OuterVolumeSpecName: "utilities") pod "2c3c406d-9994-4629-b585-4d145b1e04aa" (UID: "2c3c406d-9994-4629-b585-4d145b1e04aa"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.299996 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities" (OuterVolumeSpecName: "utilities") pod "c21b8175-025a-4d91-ad43-389dbad40846" (UID: "c21b8175-025a-4d91-ad43-389dbad40846"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.301601 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj" (OuterVolumeSpecName: "kube-api-access-rttcj") pod "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" (UID: "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0"). InnerVolumeSpecName "kube-api-access-rttcj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.302337 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk" (OuterVolumeSpecName: "kube-api-access-rdqmk") pod "c21b8175-025a-4d91-ad43-389dbad40846" (UID: "c21b8175-025a-4d91-ad43-389dbad40846"). InnerVolumeSpecName "kube-api-access-rdqmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.303903 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp" (OuterVolumeSpecName: "kube-api-access-lhwtp") pod "e2cfb22a-6632-4e35-8145-6e9815e6e76f" (UID: "e2cfb22a-6632-4e35-8145-6e9815e6e76f"). InnerVolumeSpecName "kube-api-access-lhwtp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.304536 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f" (OuterVolumeSpecName: "kube-api-access-sqj2f") pod "2c3c406d-9994-4629-b585-4d145b1e04aa" (UID: "2c3c406d-9994-4629-b585-4d145b1e04aa"). InnerVolumeSpecName "kube-api-access-sqj2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.312537 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh" (OuterVolumeSpecName: "kube-api-access-82wwh") pod "0c1d22d3-b584-4622-856c-b531a5d1ad5d" (UID: "0c1d22d3-b584-4622-856c-b531a5d1ad5d"). InnerVolumeSpecName "kube-api-access-82wwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.315562 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "e2cfb22a-6632-4e35-8145-6e9815e6e76f" (UID: "e2cfb22a-6632-4e35-8145-6e9815e6e76f"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.326569 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c3c406d-9994-4629-b585-4d145b1e04aa" (UID: "2c3c406d-9994-4629-b585-4d145b1e04aa"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.363660 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0c1d22d3-b584-4622-856c-b531a5d1ad5d" (UID: "0c1d22d3-b584-4622-856c-b531a5d1ad5d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.367543 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c21b8175-025a-4d91-ad43-389dbad40846" (UID: "c21b8175-025a-4d91-ad43-389dbad40846"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396540 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396597 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdqmk\" (UniqueName: \"kubernetes.io/projected/c21b8175-025a-4d91-ad43-389dbad40846-kube-api-access-rdqmk\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396613 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396625 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396637 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396647 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rttcj\" (UniqueName: 
\"kubernetes.io/projected/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-kube-api-access-rttcj\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396658 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c3c406d-9994-4629-b585-4d145b1e04aa-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396669 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82wwh\" (UniqueName: \"kubernetes.io/projected/0c1d22d3-b584-4622-856c-b531a5d1ad5d-kube-api-access-82wwh\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396681 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhwtp\" (UniqueName: \"kubernetes.io/projected/e2cfb22a-6632-4e35-8145-6e9815e6e76f-kube-api-access-lhwtp\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396691 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c21b8175-025a-4d91-ad43-389dbad40846-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396701 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqj2f\" (UniqueName: \"kubernetes.io/projected/2c3c406d-9994-4629-b585-4d145b1e04aa-kube-api-access-sqj2f\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396711 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0c1d22d3-b584-4622-856c-b531a5d1ad5d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396723 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.396733 4745 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/e2cfb22a-6632-4e35-8145-6e9815e6e76f-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.442653 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" (UID: "71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.468237 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hcgn8"] Mar 19 00:14:24 crc kubenswrapper[4745]: W0319 00:14:24.472004 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9dc2e0fe_a8bd_4a4f_9ee2_1685bc395d06.slice/crio-eee886943feebfac8ae303664dfd5c792fd6a0b6ca7a041aac08f33921ca9fc7 WatchSource:0}: Error finding container eee886943feebfac8ae303664dfd5c792fd6a0b6ca7a041aac08f33921ca9fc7: Status 404 returned error can't find the container with id eee886943feebfac8ae303664dfd5c792fd6a0b6ca7a041aac08f33921ca9fc7 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.495198 4745 generic.go:334] "Generic (PLEG): container finished" podID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerID="0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9" exitCode=0 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.495248 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerDied","Data":"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.495309 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hhfzg" event={"ID":"0c1d22d3-b584-4622-856c-b531a5d1ad5d","Type":"ContainerDied","Data":"1385db0e9218cd6a53bd844f3c99f4797ac5eed30c3c8451117c54ef69a818d9"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.495327 4745 scope.go:117] "RemoveContainer" containerID="0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.495266 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hhfzg" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.497044 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" event={"ID":"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06","Type":"ContainerStarted","Data":"eee886943feebfac8ae303664dfd5c792fd6a0b6ca7a041aac08f33921ca9fc7"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.497564 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.500097 4745 generic.go:334] "Generic (PLEG): container finished" podID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerID="1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243" exitCode=0 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.500134 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.500152 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerDied","Data":"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.500412 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-qn8c4" event={"ID":"e2cfb22a-6632-4e35-8145-6e9815e6e76f","Type":"ContainerDied","Data":"1ba6e1f38cebd60b48f169bf9b16cb68b35cbd4232e7b7482a4e5339486334e0"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.506078 4745 generic.go:334] "Generic (PLEG): container finished" podID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerID="97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42" exitCode=0 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.506164 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerDied","Data":"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.506269 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mtjq5" event={"ID":"2c3c406d-9994-4629-b585-4d145b1e04aa","Type":"ContainerDied","Data":"3b83c823529db11222e49bc92d4be77a23b45eab02e962cb6761c2af219fd176"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.506312 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mtjq5" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.508656 4745 generic.go:334] "Generic (PLEG): container finished" podID="c21b8175-025a-4d91-ad43-389dbad40846" containerID="1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f" exitCode=0 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.508934 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerDied","Data":"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.508964 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q9zn6" event={"ID":"c21b8175-025a-4d91-ad43-389dbad40846","Type":"ContainerDied","Data":"44402f88db283736632f4638c8a012502a669c0564707bb1cc601815d7a854a9"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.509021 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q9zn6" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.513440 4745 scope.go:117] "RemoveContainer" containerID="abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.513495 4745 generic.go:334] "Generic (PLEG): container finished" podID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerID="e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a" exitCode=0 Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.513527 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerDied","Data":"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.513553 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-75vmv" event={"ID":"71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0","Type":"ContainerDied","Data":"2cd3908399145e2519a565664cfaff071ac8bf459660c66c4bd6a1d4b7d2532a"} Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.513611 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-75vmv" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.549907 4745 scope.go:117] "RemoveContainer" containerID="aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.585822 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"] Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.585962 4745 scope.go:117] "RemoveContainer" containerID="0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.586385 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9\": container with ID starting with 0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9 not found: ID does not exist" containerID="0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.586433 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9"} err="failed to get container status \"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9\": rpc error: code = NotFound desc = could not find container \"0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9\": container with ID starting with 0b31c41eca3f11d8f8a5dfa4e4fbe5142705b4b0217f2bfb582ee9c4fa46b5e9 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.586466 4745 scope.go:117] "RemoveContainer" containerID="abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.586900 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3\": container with ID starting with abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3 not found: ID does not exist" containerID="abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.586926 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3"} err="failed to get container status \"abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3\": rpc error: code = NotFound desc = could not find container \"abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3\": container with ID starting with abf27c08f374488fef1da5e0c2c08dfb69f9ba4d337684678f3214b51391a3d3 not found: ID does not exist" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.586941 4745 scope.go:117] "RemoveContainer" containerID="aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db" Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.587470 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db\": container with ID starting with aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db not found: ID does not exist" containerID="aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db" Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.587625 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db"} err="failed to get container status \"aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db\": rpc error: code = NotFound desc = could not find container 
\"aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db\": container with ID starting with aa118a5efc6839b48bee22e1bea6eaed58d0bb2d3385559a2e8e4823abcf94db not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.587709 4745 scope.go:117] "RemoveContainer" containerID="1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.602806 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hhfzg"]
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.608166 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q9zn6"]
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.621217 4745 scope.go:117] "RemoveContainer" containerID="afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.633346 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q9zn6"]
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.652942 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"]
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.664295 4745 scope.go:117] "RemoveContainer" containerID="1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.678536 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243\": container with ID starting with 1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243 not found: ID does not exist" containerID="1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.678621 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243"} err="failed to get container status \"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243\": rpc error: code = NotFound desc = could not find container \"1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243\": container with ID starting with 1f5435b82b6e2a731a1dd8a144526120fe26996dcd8c12dcebab527a4e469243 not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.678655 4745 scope.go:117] "RemoveContainer" containerID="afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.679180 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8\": container with ID starting with afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8 not found: ID does not exist" containerID="afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.679209 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8"} err="failed to get container status \"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8\": rpc error: code = NotFound desc = could not find container \"afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8\": container with ID starting with afaab62b0b3b223b15fd2b097735b65a864ff0e04ba91c143dc1c70b64c517d8 not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.679239 4745 scope.go:117] "RemoveContainer" containerID="97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.705976 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-qn8c4"]
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.711980 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"]
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.716614 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mtjq5"]
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.722157 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"]
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.725503 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-75vmv"]
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.726980 4745 scope.go:117] "RemoveContainer" containerID="6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.742576 4745 scope.go:117] "RemoveContainer" containerID="595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.760497 4745 scope.go:117] "RemoveContainer" containerID="97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.761179 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42\": container with ID starting with 97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42 not found: ID does not exist" containerID="97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761216 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42"} err="failed to get container status \"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42\": rpc error: code = NotFound desc = could not find container \"97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42\": container with ID starting with 97c1ababbe66d2d2dddbc20327394d16bba0d052c99f7629c269d5a244adbe42 not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761241 4745 scope.go:117] "RemoveContainer" containerID="6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.761580 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4\": container with ID starting with 6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4 not found: ID does not exist" containerID="6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761608 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4"} err="failed to get container status \"6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4\": rpc error: code = NotFound desc = could not find container \"6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4\": container with ID starting with 6daacb30074f4d4cde03193a349fbdcf3015d96654f31169bf00af098365b6d4 not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761646 4745 scope.go:117] "RemoveContainer" containerID="595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.761917 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1\": container with ID starting with 595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1 not found: ID does not exist" containerID="595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761941 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1"} err="failed to get container status \"595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1\": rpc error: code = NotFound desc = could not find container \"595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1\": container with ID starting with 595a181e23a7ff6d14d9fff47ba97a1cdbb7cfa85df011f3df9beb6e8c936dd1 not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.761953 4745 scope.go:117] "RemoveContainer" containerID="1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.808764 4745 scope.go:117] "RemoveContainer" containerID="3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.826680 4745 scope.go:117] "RemoveContainer" containerID="80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.846710 4745 scope.go:117] "RemoveContainer" containerID="1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.849252 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f\": container with ID starting with 1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f not found: ID does not exist" containerID="1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.849294 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f"} err="failed to get container status \"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f\": rpc error: code = NotFound desc = could not find container \"1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f\": container with ID starting with 1f647750788b707ce38cdba6f68724d92fb52543220b726a795c8cc598d5d49f not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.849321 4745 scope.go:117] "RemoveContainer" containerID="3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.850281 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9\": container with ID starting with 3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9 not found: ID does not exist" containerID="3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.850352 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9"} err="failed to get container status \"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9\": rpc error: code = NotFound desc = could not find container \"3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9\": container with ID starting with 3d331df620bf4375e56bab77de838b852a4bd29a2d680b985a0933074bba2fd9 not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.850400 4745 scope.go:117] "RemoveContainer" containerID="80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.850825 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0\": container with ID starting with 80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0 not found: ID does not exist" containerID="80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.850864 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0"} err="failed to get container status \"80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0\": rpc error: code = NotFound desc = could not find container \"80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0\": container with ID starting with 80e7510d85dacab30d6c4afcfe08e3720c01bf5548334617566ce3b3b073acb0 not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.850899 4745 scope.go:117] "RemoveContainer" containerID="e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.865300 4745 scope.go:117] "RemoveContainer" containerID="568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.885068 4745 scope.go:117] "RemoveContainer" containerID="478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.906898 4745 scope.go:117] "RemoveContainer" containerID="e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.907509 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a\": container with ID starting with e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a not found: ID does not exist" containerID="e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.907568 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a"} err="failed to get container status \"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a\": rpc error: code = NotFound desc = could not find container \"e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a\": container with ID starting with e9190bf4db9a92cba47428c6df8df21461ccc2a1605b78ec9794aefe3b96a73a not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.907609 4745 scope.go:117] "RemoveContainer" containerID="568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.908302 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9\": container with ID starting with 568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9 not found: ID does not exist" containerID="568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.908473 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9"} err="failed to get container status \"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9\": rpc error: code = NotFound desc = could not find container \"568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9\": container with ID starting with 568ae577a19a208eccc9df7a07a0c2f6dc8724087a033882fae93913b91fe3b9 not found: ID does not exist"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.908644 4745 scope.go:117] "RemoveContainer" containerID="478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4"
Mar 19 00:14:24 crc kubenswrapper[4745]: E0319 00:14:24.909195 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4\": container with ID starting with 478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4 not found: ID does not exist" containerID="478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4"
Mar 19 00:14:24 crc kubenswrapper[4745]: I0319 00:14:24.909233 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4"} err="failed to get container status \"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4\": rpc error: code = NotFound desc = could not find container \"478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4\": container with ID starting with 478abf0e69346aa31009af703a50b16eb097ca868b4938756cdb9ce4d25085f4 not found: ID does not exist"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.527490 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" event={"ID":"9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06","Type":"ContainerStarted","Data":"55f548fd7298387a80f438dd5f717d328157496f518f27067bafdbde5b40e715"}
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.527947 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.532478 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.546522 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hcgn8" podStartSLOduration=2.546496338 podStartE2EDuration="2.546496338s" podCreationTimestamp="2026-03-19 00:14:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:14:25.546257892 +0000 UTC m=+430.084453023" watchObservedRunningTime="2026-03-19 00:14:25.546496338 +0000 UTC m=+430.084691469"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832030 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"]
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832218 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832230 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832239 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832245 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832254 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="extract-utilities"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832263 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="extract-utilities"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832270 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="extract-content"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832276 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="extract-content"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832285 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="extract-content"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832291 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="extract-content"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832300 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="extract-utilities"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.832306 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="extract-utilities"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.832313 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="extract-content"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833298 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="extract-content"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833314 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="extract-utilities"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833320 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="extract-utilities"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833328 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833333 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833344 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="extract-utilities"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833349 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="extract-utilities"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833357 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833362 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833370 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="extract-content"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833375 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="extract-content"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833386 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833391 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833495 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833511 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c21b8175-025a-4d91-ad43-389dbad40846" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833519 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833525 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833532 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833540 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" containerName="registry-server"
Mar 19 00:14:25 crc kubenswrapper[4745]: E0319 00:14:25.833625 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.833634 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" containerName="marketplace-operator"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.834244 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.837530 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.846081 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"]
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.924740 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.924812 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:25 crc kubenswrapper[4745]: I0319 00:14:25.924948 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmdx8\" (UniqueName: \"kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.026494 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.026565 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.026620 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmdx8\" (UniqueName: \"kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.027546 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.027834 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.028947 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"]
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.030223 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.033360 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.044369 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"]
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.055792 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmdx8\" (UniqueName: \"kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8\") pod \"redhat-marketplace-j587v\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.127724 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6z2b\" (UniqueName: \"kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.127840 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.127864 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.144622 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c1d22d3-b584-4622-856c-b531a5d1ad5d" path="/var/lib/kubelet/pods/0c1d22d3-b584-4622-856c-b531a5d1ad5d/volumes"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.145393 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3c406d-9994-4629-b585-4d145b1e04aa" path="/var/lib/kubelet/pods/2c3c406d-9994-4629-b585-4d145b1e04aa/volumes"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.146049 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0" path="/var/lib/kubelet/pods/71d27c5f-8b77-40dd-b3e8-9544bf9f7ec0/volumes"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.147301 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c21b8175-025a-4d91-ad43-389dbad40846" path="/var/lib/kubelet/pods/c21b8175-025a-4d91-ad43-389dbad40846/volumes"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.148017 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2cfb22a-6632-4e35-8145-6e9815e6e76f" path="/var/lib/kubelet/pods/e2cfb22a-6632-4e35-8145-6e9815e6e76f/volumes"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.156773 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j587v"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.228843 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.230307 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.230370 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.230394 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6z2b\" (UniqueName: \"kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.230632 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.249817 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6z2b\" (UniqueName: \"kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b\") pod \"certified-operators-vj7rp\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.345777 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"]
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.355644 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vj7rp"
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.535325 4745 generic.go:334] "Generic (PLEG): container finished" podID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerID="d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae" exitCode=0
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.535384 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerDied","Data":"d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae"}
Mar 19 00:14:26 crc kubenswrapper[4745]: I0319 00:14:26.536522 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerStarted","Data":"5e90ccfe8369ff143297fb00f49862c706dc2eb6a69c3dc1f5670ef331a15a02"}
Mar 19 00:14:27 crc kubenswrapper[4745]: I0319 00:14:27.078808 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"]
Mar 19 00:14:27 crc kubenswrapper[4745]: I0319 00:14:27.541903 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerStarted","Data":"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c"}
Mar 19 00:14:27 crc kubenswrapper[4745]: I0319 00:14:27.544393 4745 generic.go:334] "Generic (PLEG): container finished" podID="fa950165-f194-4022-8333-581d7681fc74" containerID="04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc" exitCode=0
Mar 19 00:14:27 crc kubenswrapper[4745]: I0319 00:14:27.544497 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerDied","Data":"04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc"}
Mar 19 00:14:27 crc kubenswrapper[4745]: I0319 00:14:27.544527 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerStarted","Data":"c186af9e1f5669e5491c37e499f3a6a8a28b64cddfa66b87effddfaec8dbd826"}
Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.238674 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-pq2gm"]
Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.239811 4745 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.246298 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.249271 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pq2gm"] Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.371925 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-catalog-content\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.371987 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-627lx\" (UniqueName: \"kubernetes.io/projected/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-kube-api-access-627lx\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.372040 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-utilities\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.431128 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6ddfl"] Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.432251 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.435774 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.449505 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ddfl"] Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.472808 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-627lx\" (UniqueName: \"kubernetes.io/projected/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-kube-api-access-627lx\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.472920 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-utilities\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.472960 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-catalog-content\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.473696 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-catalog-content\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " 
pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.473782 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-utilities\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.494351 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-627lx\" (UniqueName: \"kubernetes.io/projected/0ff66d22-2b4c-4f11-acfe-06173ee9a07e-kube-api-access-627lx\") pod \"community-operators-pq2gm\" (UID: \"0ff66d22-2b4c-4f11-acfe-06173ee9a07e\") " pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.551693 4745 generic.go:334] "Generic (PLEG): container finished" podID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerID="a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c" exitCode=0 Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.551784 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerDied","Data":"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c"} Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.554721 4745 generic.go:334] "Generic (PLEG): container finished" podID="fa950165-f194-4022-8333-581d7681fc74" containerID="e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0" exitCode=0 Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.554774 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerDied","Data":"e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0"} Mar 19 
00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.564299 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.574108 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-catalog-content\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.574174 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-utilities\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.574230 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjlc2\" (UniqueName: \"kubernetes.io/projected/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-kube-api-access-fjlc2\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.675242 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-catalog-content\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.675305 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-utilities\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.675368 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjlc2\" (UniqueName: \"kubernetes.io/projected/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-kube-api-access-fjlc2\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.676198 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-catalog-content\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.677031 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-utilities\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.696432 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjlc2\" (UniqueName: \"kubernetes.io/projected/e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5-kube-api-access-fjlc2\") pod \"redhat-operators-6ddfl\" (UID: \"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5\") " pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.758016 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.954658 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-pq2gm"] Mar 19 00:14:28 crc kubenswrapper[4745]: I0319 00:14:28.966913 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6ddfl"] Mar 19 00:14:28 crc kubenswrapper[4745]: W0319 00:14:28.976962 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ff66d22_2b4c_4f11_acfe_06173ee9a07e.slice/crio-f72e647bd76072242fa00811de7ea6fdad9d41ecc43e4826ec211c23023a16b0 WatchSource:0}: Error finding container f72e647bd76072242fa00811de7ea6fdad9d41ecc43e4826ec211c23023a16b0: Status 404 returned error can't find the container with id f72e647bd76072242fa00811de7ea6fdad9d41ecc43e4826ec211c23023a16b0 Mar 19 00:14:28 crc kubenswrapper[4745]: W0319 00:14:28.981130 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5ffb13a_ff99_429d_bfdc_cdd2a43b90c5.slice/crio-90c05f87712d8c3058349483ac97f96cec39a3d4ee4fe2ecdc7818841be4f878 WatchSource:0}: Error finding container 90c05f87712d8c3058349483ac97f96cec39a3d4ee4fe2ecdc7818841be4f878: Status 404 returned error can't find the container with id 90c05f87712d8c3058349483ac97f96cec39a3d4ee4fe2ecdc7818841be4f878 Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.562307 4745 generic.go:334] "Generic (PLEG): container finished" podID="e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5" containerID="84d24d50c83144d9d3b72ba47fd55ce8d6a46eea80ce2a350b7057cb367e05e7" exitCode=0 Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.563641 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ddfl" 
event={"ID":"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5","Type":"ContainerDied","Data":"84d24d50c83144d9d3b72ba47fd55ce8d6a46eea80ce2a350b7057cb367e05e7"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.564196 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ddfl" event={"ID":"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5","Type":"ContainerStarted","Data":"90c05f87712d8c3058349483ac97f96cec39a3d4ee4fe2ecdc7818841be4f878"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.570517 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerStarted","Data":"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.572728 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerStarted","Data":"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.574585 4745 generic.go:334] "Generic (PLEG): container finished" podID="0ff66d22-2b4c-4f11-acfe-06173ee9a07e" containerID="e1a1ce8f7fbe5a5d7423efce319a2083c3af785662842d0075f56e3632d01311" exitCode=0 Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.574624 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pq2gm" event={"ID":"0ff66d22-2b4c-4f11-acfe-06173ee9a07e","Type":"ContainerDied","Data":"e1a1ce8f7fbe5a5d7423efce319a2083c3af785662842d0075f56e3632d01311"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.574654 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pq2gm" 
event={"ID":"0ff66d22-2b4c-4f11-acfe-06173ee9a07e","Type":"ContainerStarted","Data":"f72e647bd76072242fa00811de7ea6fdad9d41ecc43e4826ec211c23023a16b0"} Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.632732 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vj7rp" podStartSLOduration=2.220038687 podStartE2EDuration="3.632710551s" podCreationTimestamp="2026-03-19 00:14:26 +0000 UTC" firstStartedPulling="2026-03-19 00:14:27.546474736 +0000 UTC m=+432.084669867" lastFinishedPulling="2026-03-19 00:14:28.95914661 +0000 UTC m=+433.497341731" observedRunningTime="2026-03-19 00:14:29.610156047 +0000 UTC m=+434.148351178" watchObservedRunningTime="2026-03-19 00:14:29.632710551 +0000 UTC m=+434.170905682" Mar 19 00:14:29 crc kubenswrapper[4745]: I0319 00:14:29.649544 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j587v" podStartSLOduration=2.138866935 podStartE2EDuration="4.649522982s" podCreationTimestamp="2026-03-19 00:14:25 +0000 UTC" firstStartedPulling="2026-03-19 00:14:26.540233895 +0000 UTC m=+431.078429026" lastFinishedPulling="2026-03-19 00:14:29.050889942 +0000 UTC m=+433.589085073" observedRunningTime="2026-03-19 00:14:29.645319099 +0000 UTC m=+434.183514220" watchObservedRunningTime="2026-03-19 00:14:29.649522982 +0000 UTC m=+434.187718123" Mar 19 00:14:30 crc kubenswrapper[4745]: I0319 00:14:30.591679 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pq2gm" event={"ID":"0ff66d22-2b4c-4f11-acfe-06173ee9a07e","Type":"ContainerStarted","Data":"0c7790717956205103ceba96837b625aab857083930a1640d4c0abb4cb000f52"} Mar 19 00:14:31 crc kubenswrapper[4745]: I0319 00:14:31.616699 4745 generic.go:334] "Generic (PLEG): container finished" podID="e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5" containerID="dcaff950c5d146e7e21523621dbc11cbe640f22926270d35b3c7e51b5bca76c2" exitCode=0 Mar 19 
00:14:31 crc kubenswrapper[4745]: I0319 00:14:31.617388 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ddfl" event={"ID":"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5","Type":"ContainerDied","Data":"dcaff950c5d146e7e21523621dbc11cbe640f22926270d35b3c7e51b5bca76c2"} Mar 19 00:14:31 crc kubenswrapper[4745]: I0319 00:14:31.630738 4745 generic.go:334] "Generic (PLEG): container finished" podID="0ff66d22-2b4c-4f11-acfe-06173ee9a07e" containerID="0c7790717956205103ceba96837b625aab857083930a1640d4c0abb4cb000f52" exitCode=0 Mar 19 00:14:31 crc kubenswrapper[4745]: I0319 00:14:31.630776 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pq2gm" event={"ID":"0ff66d22-2b4c-4f11-acfe-06173ee9a07e","Type":"ContainerDied","Data":"0c7790717956205103ceba96837b625aab857083930a1640d4c0abb4cb000f52"} Mar 19 00:14:31 crc kubenswrapper[4745]: I0319 00:14:31.988101 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-wrzf6" Mar 19 00:14:32 crc kubenswrapper[4745]: I0319 00:14:32.048706 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"] Mar 19 00:14:32 crc kubenswrapper[4745]: I0319 00:14:32.639428 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-pq2gm" event={"ID":"0ff66d22-2b4c-4f11-acfe-06173ee9a07e","Type":"ContainerStarted","Data":"d0adfba1067338989c5d93f619f49c1e3c7884671c15c26bb96ac736763d46a5"} Mar 19 00:14:32 crc kubenswrapper[4745]: I0319 00:14:32.646294 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6ddfl" event={"ID":"e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5","Type":"ContainerStarted","Data":"184bd62fbc5252a7e817993caec127eccacb7441498ad10ba91863fede3ce124"} Mar 19 00:14:32 crc kubenswrapper[4745]: I0319 00:14:32.660990 4745 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-pq2gm" podStartSLOduration=2.186050609 podStartE2EDuration="4.660971275s" podCreationTimestamp="2026-03-19 00:14:28 +0000 UTC" firstStartedPulling="2026-03-19 00:14:29.57578992 +0000 UTC m=+434.113985061" lastFinishedPulling="2026-03-19 00:14:32.050710596 +0000 UTC m=+436.588905727" observedRunningTime="2026-03-19 00:14:32.657282188 +0000 UTC m=+437.195477319" watchObservedRunningTime="2026-03-19 00:14:32.660971275 +0000 UTC m=+437.199166406" Mar 19 00:14:32 crc kubenswrapper[4745]: I0319 00:14:32.677291 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6ddfl" podStartSLOduration=2.148559322 podStartE2EDuration="4.67727351s" podCreationTimestamp="2026-03-19 00:14:28 +0000 UTC" firstStartedPulling="2026-03-19 00:14:29.565756793 +0000 UTC m=+434.103951924" lastFinishedPulling="2026-03-19 00:14:32.094470971 +0000 UTC m=+436.632666112" observedRunningTime="2026-03-19 00:14:32.673991797 +0000 UTC m=+437.212186938" watchObservedRunningTime="2026-03-19 00:14:32.67727351 +0000 UTC m=+437.215468641" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.157844 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.158249 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.202865 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.356792 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 
00:14:36.356864 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.400963 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.707934 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:14:36 crc kubenswrapper[4745]: I0319 00:14:36.710284 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.565354 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.565697 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.607896 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.716324 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-pq2gm" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.758462 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.758678 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:38 crc kubenswrapper[4745]: I0319 00:14:38.795842 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:39 crc kubenswrapper[4745]: I0319 00:14:39.716182 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6ddfl" Mar 19 00:14:45 crc kubenswrapper[4745]: I0319 00:14:45.606205 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:14:45 crc kubenswrapper[4745]: I0319 00:14:45.606569 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.093487 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" podUID="b246ac53-9c42-426b-97da-3ca4075766ab" containerName="registry" containerID="cri-o://81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0" gracePeriod=30 Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.435747 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.530809 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.530956 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.531001 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.531355 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.531433 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.531782 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.532627 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.532761 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.533124 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6wxc\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.533167 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted\") pod \"b246ac53-9c42-426b-97da-3ca4075766ab\" (UID: \"b246ac53-9c42-426b-97da-3ca4075766ab\") " Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.533665 4745 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.533691 4745 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b246ac53-9c42-426b-97da-3ca4075766ab-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.539515 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.541009 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.541909 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc" (OuterVolumeSpecName: "kube-api-access-z6wxc") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "kube-api-access-z6wxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.542928 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.544783 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.551934 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b246ac53-9c42-426b-97da-3ca4075766ab" (UID: "b246ac53-9c42-426b-97da-3ca4075766ab"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.635359 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6wxc\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-kube-api-access-z6wxc\") on node \"crc\" DevicePath \"\""
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.635409 4745 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b246ac53-9c42-426b-97da-3ca4075766ab-ca-trust-extracted\") on node \"crc\" DevicePath \"\""
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.635440 4745 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-bound-sa-token\") on node \"crc\" DevicePath \"\""
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.635453 4745 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b246ac53-9c42-426b-97da-3ca4075766ab-installation-pull-secrets\") on node \"crc\" DevicePath \"\""
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.635467 4745 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b246ac53-9c42-426b-97da-3ca4075766ab-registry-tls\") on node \"crc\" DevicePath \"\""
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.793617 4745 generic.go:334] "Generic (PLEG): container
finished" podID="b246ac53-9c42-426b-97da-3ca4075766ab" containerID="81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0" exitCode=0
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.793661 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" event={"ID":"b246ac53-9c42-426b-97da-3ca4075766ab","Type":"ContainerDied","Data":"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0"}
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.793696 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62" event={"ID":"b246ac53-9c42-426b-97da-3ca4075766ab","Type":"ContainerDied","Data":"de743d9871d9e519d057ac18763fdb10aeceb0154e79a293ccaea85445d780d3"}
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.793719 4745 scope.go:117] "RemoveContainer" containerID="81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0"
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.793748 4745 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ggt62"
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.810965 4745 scope.go:117] "RemoveContainer" containerID="81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0"
Mar 19 00:14:57 crc kubenswrapper[4745]: E0319 00:14:57.811511 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0\": container with ID starting with 81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0 not found: ID does not exist" containerID="81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0"
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.811559 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0"} err="failed to get container status \"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0\": rpc error: code = NotFound desc = could not find container \"81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0\": container with ID starting with 81a8485ab6bdaa29774fde7cf5a119fa8f9912543c5c93a6a3e8b6a5170d4fd0 not found: ID does not exist"
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.828841 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"]
Mar 19 00:14:57 crc kubenswrapper[4745]: I0319 00:14:57.832742 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ggt62"]
Mar 19 00:14:58 crc kubenswrapper[4745]: I0319 00:14:58.145682 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b246ac53-9c42-426b-97da-3ca4075766ab" path="/var/lib/kubelet/pods/b246ac53-9c42-426b-97da-3ca4075766ab/volumes"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319
00:15:00.145970 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"]
Mar 19 00:15:00 crc kubenswrapper[4745]: E0319 00:15:00.146154 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b246ac53-9c42-426b-97da-3ca4075766ab" containerName="registry"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.146167 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b246ac53-9c42-426b-97da-3ca4075766ab" containerName="registry"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.146269 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="b246ac53-9c42-426b-97da-3ca4075766ab" containerName="registry"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.146654 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.149938 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.150124 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.152504 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"]
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.267377 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc
kubenswrapper[4745]: I0319 00:15:00.267456 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.267495 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfbsl\" (UniqueName: \"kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.368164 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.368260 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.368294 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfbsl\" (UniqueName: \"kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl\") pod
\"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.369430 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.375325 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.386799 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfbsl\" (UniqueName: \"kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl\") pod \"collect-profiles-29564655-tq8kj\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.467015 4745 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.714927 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"]
Mar 19 00:15:00 crc kubenswrapper[4745]: I0319 00:15:00.812111 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" event={"ID":"9310ac17-728f-450e-9e05-4159ab257626","Type":"ContainerStarted","Data":"051f03921ea5e42b6a343d7a56bd3fd8fbad5e12835e07f347b447cf3fcb073c"}
Mar 19 00:15:01 crc kubenswrapper[4745]: I0319 00:15:01.818428 4745 generic.go:334] "Generic (PLEG): container finished" podID="9310ac17-728f-450e-9e05-4159ab257626" containerID="3418c0630502ef3e4a53904e75f1c29d16c5a3f13f1d69499262a568825926a3" exitCode=0
Mar 19 00:15:01 crc kubenswrapper[4745]: I0319 00:15:01.818480 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" event={"ID":"9310ac17-728f-450e-9e05-4159ab257626","Type":"ContainerDied","Data":"3418c0630502ef3e4a53904e75f1c29d16c5a3f13f1d69499262a568825926a3"}
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.030116 4745 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.099173 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfbsl\" (UniqueName: \"kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl\") pod \"9310ac17-728f-450e-9e05-4159ab257626\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") "
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.099264 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume\") pod \"9310ac17-728f-450e-9e05-4159ab257626\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") "
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.099323 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume\") pod \"9310ac17-728f-450e-9e05-4159ab257626\" (UID: \"9310ac17-728f-450e-9e05-4159ab257626\") "
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.100416 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume" (OuterVolumeSpecName: "config-volume") pod "9310ac17-728f-450e-9e05-4159ab257626" (UID: "9310ac17-728f-450e-9e05-4159ab257626"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.104971 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9310ac17-728f-450e-9e05-4159ab257626" (UID: "9310ac17-728f-450e-9e05-4159ab257626"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.105771 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl" (OuterVolumeSpecName: "kube-api-access-kfbsl") pod "9310ac17-728f-450e-9e05-4159ab257626" (UID: "9310ac17-728f-450e-9e05-4159ab257626"). InnerVolumeSpecName "kube-api-access-kfbsl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.200677 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9310ac17-728f-450e-9e05-4159ab257626-config-volume\") on node \"crc\" DevicePath \"\""
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.200735 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfbsl\" (UniqueName: \"kubernetes.io/projected/9310ac17-728f-450e-9e05-4159ab257626-kube-api-access-kfbsl\") on node \"crc\" DevicePath \"\""
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.200750 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9310ac17-728f-450e-9e05-4159ab257626-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.830373 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj" event={"ID":"9310ac17-728f-450e-9e05-4159ab257626","Type":"ContainerDied","Data":"051f03921ea5e42b6a343d7a56bd3fd8fbad5e12835e07f347b447cf3fcb073c"}
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.830407 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="051f03921ea5e42b6a343d7a56bd3fd8fbad5e12835e07f347b447cf3fcb073c"
Mar 19 00:15:03 crc kubenswrapper[4745]: I0319 00:15:03.830442 4745 util.go:48] "No ready sandbox for pod can be
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564655-tq8kj"
Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.606018 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.607099 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.607209 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5"
Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.608757 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.609029 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10" gracePeriod=600
Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.911446 4745 generic.go:334] "Generic (PLEG): container
finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10" exitCode=0
Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.911510 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10"}
Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.911830 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d"}
Mar 19 00:15:15 crc kubenswrapper[4745]: I0319 00:15:15.911864 4745 scope.go:117] "RemoveContainer" containerID="e781ee5ae6c9d9b5733634852a564c2eae2e0b99cb6153a6d58f2144039bf574"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.147722 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564656-7jksw"]
Mar 19 00:16:00 crc kubenswrapper[4745]: E0319 00:16:00.149411 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9310ac17-728f-450e-9e05-4159ab257626" containerName="collect-profiles"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.149483 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9310ac17-728f-450e-9e05-4159ab257626" containerName="collect-profiles"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.149655 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9310ac17-728f-450e-9e05-4159ab257626" containerName="collect-profiles"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.150086 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564656-7jksw"]
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319
00:16:00.150306 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564656-7jksw"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.153916 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.154098 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.154549 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.336892 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dqk6\" (UniqueName: \"kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6\") pod \"auto-csr-approver-29564656-7jksw\" (UID: \"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1\") " pod="openshift-infra/auto-csr-approver-29564656-7jksw"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.441950 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dqk6\" (UniqueName: \"kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6\") pod \"auto-csr-approver-29564656-7jksw\" (UID: \"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1\") " pod="openshift-infra/auto-csr-approver-29564656-7jksw"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.463183 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dqk6\" (UniqueName: \"kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6\") pod \"auto-csr-approver-29564656-7jksw\" (UID: \"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1\") " pod="openshift-infra/auto-csr-approver-29564656-7jksw"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319
00:16:00.471092 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564656-7jksw"
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.666226 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564656-7jksw"]
Mar 19 00:16:00 crc kubenswrapper[4745]: I0319 00:16:00.692274 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 00:16:01 crc kubenswrapper[4745]: I0319 00:16:01.176360 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564656-7jksw" event={"ID":"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1","Type":"ContainerStarted","Data":"1a176777ee24fdbcfa0fdb9bbe56ac8ed1ef043829a0399e6cd32adf043133da"}
Mar 19 00:16:02 crc kubenswrapper[4745]: I0319 00:16:02.182253 4745 generic.go:334] "Generic (PLEG): container finished" podID="f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" containerID="4cf1138f66461b0db8f8d82a562c5595a0d75aaab97369e7761de279cdf0fb9b" exitCode=0
Mar 19 00:16:02 crc kubenswrapper[4745]: I0319 00:16:02.182297 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564656-7jksw" event={"ID":"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1","Type":"ContainerDied","Data":"4cf1138f66461b0db8f8d82a562c5595a0d75aaab97369e7761de279cdf0fb9b"}
Mar 19 00:16:03 crc kubenswrapper[4745]: I0319 00:16:03.385070 4745 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564656-7jksw"
Mar 19 00:16:03 crc kubenswrapper[4745]: I0319 00:16:03.579086 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dqk6\" (UniqueName: \"kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6\") pod \"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1\" (UID: \"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1\") "
Mar 19 00:16:03 crc kubenswrapper[4745]: I0319 00:16:03.589449 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6" (OuterVolumeSpecName: "kube-api-access-6dqk6") pod "f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" (UID: "f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1"). InnerVolumeSpecName "kube-api-access-6dqk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:16:03 crc kubenswrapper[4745]: I0319 00:16:03.680680 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dqk6\" (UniqueName: \"kubernetes.io/projected/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1-kube-api-access-6dqk6\") on node \"crc\" DevicePath \"\""
Mar 19 00:16:04 crc kubenswrapper[4745]: I0319 00:16:04.195374 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564656-7jksw" event={"ID":"f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1","Type":"ContainerDied","Data":"1a176777ee24fdbcfa0fdb9bbe56ac8ed1ef043829a0399e6cd32adf043133da"}
Mar 19 00:16:04 crc kubenswrapper[4745]: I0319 00:16:04.195411 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a176777ee24fdbcfa0fdb9bbe56ac8ed1ef043829a0399e6cd32adf043133da"
Mar 19 00:16:04 crc kubenswrapper[4745]: I0319 00:16:04.195471 4745 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564656-7jksw"
Mar 19 00:16:04 crc kubenswrapper[4745]: I0319 00:16:04.448503 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564650-7k6ld"]
Mar 19 00:16:04 crc kubenswrapper[4745]: I0319 00:16:04.455019 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564650-7k6ld"]
Mar 19 00:16:06 crc kubenswrapper[4745]: I0319 00:16:06.146844 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d14d18f5-0177-4458-8ea3-b266cc96d658" path="/var/lib/kubelet/pods/d14d18f5-0177-4458-8ea3-b266cc96d658/volumes"
Mar 19 00:17:15 crc kubenswrapper[4745]: I0319 00:17:15.606759 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 00:17:15 crc kubenswrapper[4745]: I0319 00:17:15.607465 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 00:17:45 crc kubenswrapper[4745]: I0319 00:17:45.606586 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 00:17:45 crc kubenswrapper[4745]: I0319 00:17:45.608525 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5"
podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.154524 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564658-6vdd5"]
Mar 19 00:18:00 crc kubenswrapper[4745]: E0319 00:18:00.155482 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" containerName="oc"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.155498 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" containerName="oc"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.155614 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" containerName="oc"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.156066 4745 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564658-6vdd5"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.162819 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564658-6vdd5"]
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.172284 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.172796 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.173024 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.313353 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9d94f\" (UniqueName: \"kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f\") pod \"auto-csr-approver-29564658-6vdd5\" (UID: \"7807a7d0-ff52-4a76-b083-19eca144b510\") " pod="openshift-infra/auto-csr-approver-29564658-6vdd5"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.414899 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9d94f\" (UniqueName: \"kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f\") pod \"auto-csr-approver-29564658-6vdd5\" (UID: \"7807a7d0-ff52-4a76-b083-19eca144b510\") " pod="openshift-infra/auto-csr-approver-29564658-6vdd5"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.434688 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9d94f\" (UniqueName: \"kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f\") pod \"auto-csr-approver-29564658-6vdd5\" (UID: \"7807a7d0-ff52-4a76-b083-19eca144b510\") "
pod="openshift-infra/auto-csr-approver-29564658-6vdd5"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.489826 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564658-6vdd5"
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.684699 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564658-6vdd5"]
Mar 19 00:18:00 crc kubenswrapper[4745]: I0319 00:18:00.959812 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" event={"ID":"7807a7d0-ff52-4a76-b083-19eca144b510","Type":"ContainerStarted","Data":"3165a41ad3cd4db1417a502aaf6684541dbe44ff096ef9becfcf62199c1f7dac"}
Mar 19 00:18:01 crc kubenswrapper[4745]: I0319 00:18:01.966836 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" event={"ID":"7807a7d0-ff52-4a76-b083-19eca144b510","Type":"ContainerStarted","Data":"91311d7617172e5175d1b2c1df977704664ce95b1113f4d27a4b6a3392f4c27c"}
Mar 19 00:18:01 crc kubenswrapper[4745]: I0319 00:18:01.982164 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" podStartSLOduration=1.08076951 podStartE2EDuration="1.982140234s" podCreationTimestamp="2026-03-19 00:18:00 +0000 UTC" firstStartedPulling="2026-03-19 00:18:00.695021216 +0000 UTC m=+645.233216347" lastFinishedPulling="2026-03-19 00:18:01.59639194 +0000 UTC m=+646.134587071" observedRunningTime="2026-03-19 00:18:01.980361689 +0000 UTC m=+646.518556830" watchObservedRunningTime="2026-03-19 00:18:01.982140234 +0000 UTC m=+646.520335365"
Mar 19 00:18:02 crc kubenswrapper[4745]: I0319 00:18:02.974284 4745 generic.go:334] "Generic (PLEG): container finished" podID="7807a7d0-ff52-4a76-b083-19eca144b510" containerID="91311d7617172e5175d1b2c1df977704664ce95b1113f4d27a4b6a3392f4c27c" exitCode=0
Mar 19 00:18:02 crc kubenswrapper[4745]:
I0319 00:18:02.974378 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" event={"ID":"7807a7d0-ff52-4a76-b083-19eca144b510","Type":"ContainerDied","Data":"91311d7617172e5175d1b2c1df977704664ce95b1113f4d27a4b6a3392f4c27c"} Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.224603 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.365358 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9d94f\" (UniqueName: \"kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f\") pod \"7807a7d0-ff52-4a76-b083-19eca144b510\" (UID: \"7807a7d0-ff52-4a76-b083-19eca144b510\") " Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.373260 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f" (OuterVolumeSpecName: "kube-api-access-9d94f") pod "7807a7d0-ff52-4a76-b083-19eca144b510" (UID: "7807a7d0-ff52-4a76-b083-19eca144b510"). InnerVolumeSpecName "kube-api-access-9d94f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.467412 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9d94f\" (UniqueName: \"kubernetes.io/projected/7807a7d0-ff52-4a76-b083-19eca144b510-kube-api-access-9d94f\") on node \"crc\" DevicePath \"\"" Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.988963 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" event={"ID":"7807a7d0-ff52-4a76-b083-19eca144b510","Type":"ContainerDied","Data":"3165a41ad3cd4db1417a502aaf6684541dbe44ff096ef9becfcf62199c1f7dac"} Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.989966 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3165a41ad3cd4db1417a502aaf6684541dbe44ff096ef9becfcf62199c1f7dac" Mar 19 00:18:04 crc kubenswrapper[4745]: I0319 00:18:04.988994 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564658-6vdd5" Mar 19 00:18:05 crc kubenswrapper[4745]: I0319 00:18:05.046328 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564652-nhhsh"] Mar 19 00:18:05 crc kubenswrapper[4745]: I0319 00:18:05.050231 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564652-nhhsh"] Mar 19 00:18:06 crc kubenswrapper[4745]: I0319 00:18:06.147594 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ef55829-c74d-4c78-b9b9-1c3ea05456e9" path="/var/lib/kubelet/pods/9ef55829-c74d-4c78-b9b9-1c3ea05456e9/volumes" Mar 19 00:18:15 crc kubenswrapper[4745]: I0319 00:18:15.606250 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 19 00:18:15 crc kubenswrapper[4745]: I0319 00:18:15.607169 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:18:15 crc kubenswrapper[4745]: I0319 00:18:15.607252 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:18:15 crc kubenswrapper[4745]: I0319 00:18:15.608451 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:18:15 crc kubenswrapper[4745]: I0319 00:18:15.608559 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d" gracePeriod=600 Mar 19 00:18:16 crc kubenswrapper[4745]: I0319 00:18:16.083811 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d" exitCode=0 Mar 19 00:18:16 crc kubenswrapper[4745]: I0319 00:18:16.083982 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" 
event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d"} Mar 19 00:18:16 crc kubenswrapper[4745]: I0319 00:18:16.084354 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018"} Mar 19 00:18:16 crc kubenswrapper[4745]: I0319 00:18:16.084385 4745 scope.go:117] "RemoveContainer" containerID="de10a1d38d98124c46ca0c82dd88e28606d5f5ae568f904a533071011c2c4c10" Mar 19 00:18:34 crc kubenswrapper[4745]: I0319 00:18:34.631142 4745 scope.go:117] "RemoveContainer" containerID="5516b8a0a4bc7aafa493bd87254867dd7254eae5e71faee49575516dfd155284" Mar 19 00:18:34 crc kubenswrapper[4745]: I0319 00:18:34.671986 4745 scope.go:117] "RemoveContainer" containerID="76cb550291e5cd7aae935eea1a8dd025dfbc6f11748c2597964f9ad53d8ac6b0" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.079016 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w2988"] Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.086138 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="nbdb" containerID="cri-o://9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.086993 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="sbdb" containerID="cri-o://1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.087363 4745 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.087469 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="northd" containerID="cri-o://b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.087551 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-node" containerID="cri-o://f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.087599 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-acl-logging" containerID="cri-o://6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.087744 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-controller" containerID="cri-o://d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.158356 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" 
podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" containerID="cri-o://878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" gracePeriod=30 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.455656 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/3.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.459023 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovn-acl-logging/0.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.459552 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovn-controller/0.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.460368 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.509495 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/2.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.510224 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/1.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.510255 4745 generic.go:334] "Generic (PLEG): container finished" podID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" containerID="24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8" exitCode=2 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.510316 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" 
event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerDied","Data":"24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.510362 4745 scope.go:117] "RemoveContainer" containerID="486f90c5b94f241c425978c765c7685a7678b6df2f0ffffd2583ae2ed3e3d915" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.510942 4745 scope.go:117] "RemoveContainer" containerID="24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.511243 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mlwp7_openshift-multus(6a0ae9c0-f19a-4038-be03-0fa6d223ebbf)\"" pod="openshift-multus/multus-mlwp7" podUID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519191 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-64vfm"] Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519537 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519560 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519577 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519588 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519603 4745 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kubecfg-setup" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519611 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kubecfg-setup" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519627 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="northd" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519635 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="northd" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519645 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7807a7d0-ff52-4a76-b083-19eca144b510" containerName="oc" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519656 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7807a7d0-ff52-4a76-b083-19eca144b510" containerName="oc" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519667 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519676 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519686 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-node" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519696 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-node" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519705 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519714 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519728 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519736 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519745 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="nbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519753 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="nbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519772 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-acl-logging" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519781 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-acl-logging" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.519798 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="sbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519806 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="sbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519957 4745 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7807a7d0-ff52-4a76-b083-19eca144b510" containerName="oc" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519973 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="northd" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519984 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.519994 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="sbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520006 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520017 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520029 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="nbdb" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520039 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520048 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520060 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovn-acl-logging" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520071 4745 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="kube-rbac-proxy-node" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.520197 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520206 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520323 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520337 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.520459 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.520470 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="21835778-c889-4031-b630-586c00f200f9" containerName="ovnkube-controller" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.522653 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.523209 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovnkube-controller/3.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.527796 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovn-acl-logging/0.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.528369 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-w2988_21835778-c889-4031-b630-586c00f200f9/ovn-controller/0.log" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529071 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529118 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529128 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529142 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529153 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" 
containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529162 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" exitCode=0 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529171 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" exitCode=143 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529181 4745 generic.go:334] "Generic (PLEG): container finished" podID="21835778-c889-4031-b630-586c00f200f9" containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" exitCode=143 Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529211 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529225 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529250 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529271 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529294 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529318 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529332 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529370 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529433 4745 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529443 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529471 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529483 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529490 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529498 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529506 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529516 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529522 4745 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529533 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529546 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529555 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529566 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529573 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529579 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529588 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} Mar 19 
00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529596 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529606 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529614 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529622 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529635 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529649 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529659 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529667 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529674 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529682 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529689 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529696 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529704 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529711 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529719 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529730 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-w2988" event={"ID":"21835778-c889-4031-b630-586c00f200f9","Type":"ContainerDied","Data":"71f66e30efa5016ee954a4cb19c576186a237cdc85750ce0837af353f57d6b56"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529743 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529751 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529759 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529766 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529772 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529780 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529856 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529869 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529910 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.529921 4745 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"}
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.567787 4745 scope.go:117] "RemoveContainer" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568109 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568179 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568207 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568240 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568269 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568305 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568338 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwglz\" (UniqueName: \"kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568354 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568371 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568398 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568414 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568437 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568451 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568471 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568508 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568525 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568550 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568573 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568613 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568630 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch\") pod \"21835778-c889-4031-b630-586c00f200f9\" (UID: \"21835778-c889-4031-b630-586c00f200f9\") "
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568896 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569090 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log" (OuterVolumeSpecName: "node-log") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569120 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569401 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569440 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569467 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569474 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569510 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569523 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash" (OuterVolumeSpecName: "host-slash") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569537 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569555 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket" (OuterVolumeSpecName: "log-socket") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569579 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.569958 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.570086 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.570139 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.568949 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.574764 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.575588 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz" (OuterVolumeSpecName: "kube-api-access-kwglz") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "kube-api-access-kwglz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.583969 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "21835778-c889-4031-b630-586c00f200f9" (UID: "21835778-c889-4031-b630-586c00f200f9"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.588934 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.607560 4745 scope.go:117] "RemoveContainer" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.620777 4745 scope.go:117] "RemoveContainer" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.638717 4745 scope.go:117] "RemoveContainer" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.653631 4745 scope.go:117] "RemoveContainer" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.666579 4745 scope.go:117] "RemoveContainer" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670508 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-systemd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670543 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-var-lib-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670568 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdrvb\" (UniqueName: \"kubernetes.io/projected/c5d25893-8bce-46da-9806-2fde750d93d0-kube-api-access-xdrvb\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670592 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-ovn\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670611 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-netns\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.670725 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-script-lib\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671013 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-log-socket\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671085 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671154 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-slash\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671183 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-netd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671207 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671235 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671270 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-config\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671298 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-kubelet\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671337 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5d25893-8bce-46da-9806-2fde750d93d0-ovn-node-metrics-cert\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671358 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-systemd-units\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671383 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-bin\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671416 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-node-log\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671596 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-env-overrides\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671646 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-etc-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671827 4745 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671849 4745 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671860 4745 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671873 4745 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/21835778-c889-4031-b630-586c00f200f9-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671904 4745 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671917 4745 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671933 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671945 4745 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671959 4745 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-systemd\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671974 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwglz\" (UniqueName: \"kubernetes.io/projected/21835778-c889-4031-b630-586c00f200f9-kube-api-access-kwglz\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671987 4745 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-log-socket\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.671999 4745 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672011 4745 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/21835778-c889-4031-b630-586c00f200f9-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672023 4745 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-node-log\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672034 4745 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672045 4745 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672058 4745 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-slash\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672069 4745 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672078 4745 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.672090 4745 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/21835778-c889-4031-b630-586c00f200f9-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.679034 4745 scope.go:117] "RemoveContainer" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.692628 4745 scope.go:117] "RemoveContainer" containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.708762 4745 scope.go:117] "RemoveContainer" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.724121 4745 scope.go:117] "RemoveContainer" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"
Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.724598 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": container with ID starting with 878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec not found: ID does not exist" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.724631 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} err="failed to get container status \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": rpc error: code = NotFound desc = could not find container \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": container with ID starting with 878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec not found: ID does not exist"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.724654 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"
Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.725053 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": container with ID starting with 4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6 not found: ID does not exist" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.725109 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} err="failed to get container status \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": rpc error: code = NotFound desc = could not find container \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": container with ID starting with 4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6 not found: ID does not exist"
Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.725150 4745 scope.go:117]
"RemoveContainer" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.725578 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": container with ID starting with 1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16 not found: ID does not exist" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.725612 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} err="failed to get container status \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": rpc error: code = NotFound desc = could not find container \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": container with ID starting with 1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.725632 4745 scope.go:117] "RemoveContainer" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.725970 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": container with ID starting with 9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464 not found: ID does not exist" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.725991 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} err="failed to get container status \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": rpc error: code = NotFound desc = could not find container \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": container with ID starting with 9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.726008 4745 scope.go:117] "RemoveContainer" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.726331 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": container with ID starting with b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719 not found: ID does not exist" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.726381 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} err="failed to get container status \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": rpc error: code = NotFound desc = could not find container \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": container with ID starting with b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.726412 4745 scope.go:117] "RemoveContainer" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.726951 4745 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": container with ID starting with e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055 not found: ID does not exist" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.726978 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} err="failed to get container status \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": rpc error: code = NotFound desc = could not find container \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": container with ID starting with e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.726999 4745 scope.go:117] "RemoveContainer" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.727250 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": container with ID starting with f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6 not found: ID does not exist" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727277 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} err="failed to get container status \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": rpc error: code = NotFound desc = could not find container 
\"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": container with ID starting with f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727291 4745 scope.go:117] "RemoveContainer" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.727553 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": container with ID starting with 6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66 not found: ID does not exist" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727587 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} err="failed to get container status \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": rpc error: code = NotFound desc = could not find container \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": container with ID starting with 6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727611 4745 scope.go:117] "RemoveContainer" containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.727861 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": container with ID starting with d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf not found: ID does not exist" 
containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727948 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} err="failed to get container status \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": rpc error: code = NotFound desc = could not find container \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": container with ID starting with d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.727972 4745 scope.go:117] "RemoveContainer" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: E0319 00:19:19.728268 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": container with ID starting with 33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d not found: ID does not exist" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728296 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} err="failed to get container status \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": rpc error: code = NotFound desc = could not find container \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": container with ID starting with 33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728315 4745 scope.go:117] 
"RemoveContainer" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728559 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} err="failed to get container status \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": rpc error: code = NotFound desc = could not find container \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": container with ID starting with 878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728582 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728831 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} err="failed to get container status \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": rpc error: code = NotFound desc = could not find container \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": container with ID starting with 4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.728851 4745 scope.go:117] "RemoveContainer" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729192 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} err="failed to get container status \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": rpc error: code = 
NotFound desc = could not find container \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": container with ID starting with 1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729219 4745 scope.go:117] "RemoveContainer" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729438 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} err="failed to get container status \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": rpc error: code = NotFound desc = could not find container \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": container with ID starting with 9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729465 4745 scope.go:117] "RemoveContainer" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729728 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} err="failed to get container status \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": rpc error: code = NotFound desc = could not find container \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": container with ID starting with b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.729755 4745 scope.go:117] "RemoveContainer" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc 
kubenswrapper[4745]: I0319 00:19:19.730038 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} err="failed to get container status \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": rpc error: code = NotFound desc = could not find container \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": container with ID starting with e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730056 4745 scope.go:117] "RemoveContainer" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730230 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} err="failed to get container status \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": rpc error: code = NotFound desc = could not find container \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": container with ID starting with f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730249 4745 scope.go:117] "RemoveContainer" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730535 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} err="failed to get container status \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": rpc error: code = NotFound desc = could not find container \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": container 
with ID starting with 6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730552 4745 scope.go:117] "RemoveContainer" containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730769 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} err="failed to get container status \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": rpc error: code = NotFound desc = could not find container \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": container with ID starting with d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.730797 4745 scope.go:117] "RemoveContainer" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.731077 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} err="failed to get container status \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": rpc error: code = NotFound desc = could not find container \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": container with ID starting with 33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.731098 4745 scope.go:117] "RemoveContainer" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.731641 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} err="failed to get container status \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": rpc error: code = NotFound desc = could not find container \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": container with ID starting with 878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.731677 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732015 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} err="failed to get container status \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": rpc error: code = NotFound desc = could not find container \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": container with ID starting with 4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732040 4745 scope.go:117] "RemoveContainer" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732333 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} err="failed to get container status \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": rpc error: code = NotFound desc = could not find container \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": container with ID starting with 1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16 not found: ID does not 
exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732352 4745 scope.go:117] "RemoveContainer" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732598 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} err="failed to get container status \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": rpc error: code = NotFound desc = could not find container \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": container with ID starting with 9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732629 4745 scope.go:117] "RemoveContainer" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732965 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} err="failed to get container status \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": rpc error: code = NotFound desc = could not find container \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": container with ID starting with b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.732994 4745 scope.go:117] "RemoveContainer" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.733276 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} err="failed to get container status 
\"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": rpc error: code = NotFound desc = could not find container \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": container with ID starting with e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.733302 4745 scope.go:117] "RemoveContainer" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.733558 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} err="failed to get container status \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": rpc error: code = NotFound desc = could not find container \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": container with ID starting with f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.733590 4745 scope.go:117] "RemoveContainer" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734063 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} err="failed to get container status \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": rpc error: code = NotFound desc = could not find container \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": container with ID starting with 6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734090 4745 scope.go:117] "RemoveContainer" 
containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734346 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} err="failed to get container status \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": rpc error: code = NotFound desc = could not find container \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": container with ID starting with d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734373 4745 scope.go:117] "RemoveContainer" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734646 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} err="failed to get container status \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": rpc error: code = NotFound desc = could not find container \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": container with ID starting with 33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734676 4745 scope.go:117] "RemoveContainer" containerID="878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734945 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec"} err="failed to get container status \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": rpc error: code = NotFound desc = could 
not find container \"878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec\": container with ID starting with 878fdcc9719ce01acbb794dad833b0b3192cd04245b2ec18310230d3bf63c5ec not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.734967 4745 scope.go:117] "RemoveContainer" containerID="4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.735210 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6"} err="failed to get container status \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": rpc error: code = NotFound desc = could not find container \"4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6\": container with ID starting with 4f98e55742fa40904602d504b116d0611e659ac0b5cbcb2fafa9a0d5d8d1a7b6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.735235 4745 scope.go:117] "RemoveContainer" containerID="1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.735585 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16"} err="failed to get container status \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": rpc error: code = NotFound desc = could not find container \"1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16\": container with ID starting with 1c92dca9d23b9e3861c74e8c4007a16e7ebdd74dafb10736908f6968eefaca16 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.735614 4745 scope.go:117] "RemoveContainer" containerID="9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 
00:19:19.735971 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464"} err="failed to get container status \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": rpc error: code = NotFound desc = could not find container \"9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464\": container with ID starting with 9c87bef5819375621418ba519d591c40d19dc2e0a8908bd332b0796fac2a0464 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.735999 4745 scope.go:117] "RemoveContainer" containerID="b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.736324 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719"} err="failed to get container status \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": rpc error: code = NotFound desc = could not find container \"b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719\": container with ID starting with b505a76915d613623cc641ee70e19a8b81ce8bd714926f21ad0eeff1ecf9c719 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.736354 4745 scope.go:117] "RemoveContainer" containerID="e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.736625 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055"} err="failed to get container status \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": rpc error: code = NotFound desc = could not find container \"e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055\": container with ID starting with 
e989b436b247a52752bae198e657ba321f190202b6efb2214c502323e39cb055 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.736645 4745 scope.go:117] "RemoveContainer" containerID="f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737061 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6"} err="failed to get container status \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": rpc error: code = NotFound desc = could not find container \"f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6\": container with ID starting with f03ef55fb4203421da5d041a73fd7d362d6f56b7fc2f5075853c98730793ecb6 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737090 4745 scope.go:117] "RemoveContainer" containerID="6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737424 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66"} err="failed to get container status \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": rpc error: code = NotFound desc = could not find container \"6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66\": container with ID starting with 6a1bf1461544ec2e6ca990790c86a410bfb3eee4593823f5ca5c6e47b3aa6f66 not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737446 4745 scope.go:117] "RemoveContainer" containerID="d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737804 4745 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf"} err="failed to get container status \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": rpc error: code = NotFound desc = could not find container \"d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf\": container with ID starting with d7970af91caab76652c2c39a1508385760547ecdf223c62b753556d9478dd6cf not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.737821 4745 scope.go:117] "RemoveContainer" containerID="33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.738125 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d"} err="failed to get container status \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": rpc error: code = NotFound desc = could not find container \"33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d\": container with ID starting with 33b407061668c052e11314b25ceb36a725fe5b6cd77c2982e4ec42e926c7df3d not found: ID does not exist" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773039 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-systemd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773112 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-var-lib-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773148 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdrvb\" (UniqueName: \"kubernetes.io/projected/c5d25893-8bce-46da-9806-2fde750d93d0-kube-api-access-xdrvb\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773152 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-systemd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773188 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-ovn\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773222 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-netns\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773263 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-script-lib\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc 
kubenswrapper[4745]: I0319 00:19:19.773286 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-var-lib-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773297 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-log-socket\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773360 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-log-socket\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773376 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-ovn\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773418 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773434 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-netns\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773389 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-run-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773597 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-slash\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773658 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-netd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773701 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773762 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773768 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-netd\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773768 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-slash\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773827 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773844 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-run-ovn-kubernetes\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.773857 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-config\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.774329 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-script-lib\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.774873 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-kubelet\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.774947 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-kubelet\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775032 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5d25893-8bce-46da-9806-2fde750d93d0-ovn-node-metrics-cert\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775069 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-systemd-units\") pod 
\"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775106 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-bin\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775184 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-node-log\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775256 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-ovnkube-config\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775271 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-env-overrides\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775376 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-etc-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775553 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-etc-openvswitch\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775561 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-host-cni-bin\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775586 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-systemd-units\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775605 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c5d25893-8bce-46da-9806-2fde750d93d0-node-log\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.775993 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c5d25893-8bce-46da-9806-2fde750d93d0-env-overrides\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.779168 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c5d25893-8bce-46da-9806-2fde750d93d0-ovn-node-metrics-cert\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.798187 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdrvb\" (UniqueName: \"kubernetes.io/projected/c5d25893-8bce-46da-9806-2fde750d93d0-kube-api-access-xdrvb\") pod \"ovnkube-node-64vfm\" (UID: \"c5d25893-8bce-46da-9806-2fde750d93d0\") " pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.855331 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.874054 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w2988"] Mar 19 00:19:19 crc kubenswrapper[4745]: I0319 00:19:19.878680 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-w2988"] Mar 19 00:19:20 crc kubenswrapper[4745]: I0319 00:19:20.145211 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21835778-c889-4031-b630-586c00f200f9" path="/var/lib/kubelet/pods/21835778-c889-4031-b630-586c00f200f9/volumes" Mar 19 00:19:20 crc kubenswrapper[4745]: I0319 00:19:20.536963 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/2.log" Mar 19 00:19:20 crc kubenswrapper[4745]: I0319 00:19:20.540050 4745 generic.go:334] "Generic (PLEG): container finished" podID="c5d25893-8bce-46da-9806-2fde750d93d0" containerID="33ddc9ac56241f7dea8fc16946fde7802445ed2eda1d4a9745bd4ad269bdd58e" exitCode=0 Mar 19 00:19:20 crc kubenswrapper[4745]: 
I0319 00:19:20.540098 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerDied","Data":"33ddc9ac56241f7dea8fc16946fde7802445ed2eda1d4a9745bd4ad269bdd58e"} Mar 19 00:19:20 crc kubenswrapper[4745]: I0319 00:19:20.540128 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"b1c2f82361ff0cd2cae3b6f6fc64c915407d8c35cb4778248121cd1f76552426"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.550799 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"18e63ceecbccfa8df09defdfecaae77390bb55d5bda8680dba4b7dbeab634af3"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.551272 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"402da4f74f043885d98933315cb68889e3d5f573bfe4e1da270e0f4df467536a"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.551285 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"a58976e2141d0d0bf167da8229880b6e30f04b789dd285f4253a8f573f98e121"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.551296 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"b0a673b932ca8a42026854f32cea1c4dc9c2c7a512f560e493085f95531dc135"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.551306 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"4762d3a9f47e16b83eea535260293517472f549a8c14be6e95319819ab20e5bf"} Mar 19 00:19:21 crc kubenswrapper[4745]: I0319 00:19:21.551318 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"073eafa1e9157645b3f26f03b633f63ad809b9c7ae8bebb379a2f8f8310068f2"} Mar 19 00:19:24 crc kubenswrapper[4745]: I0319 00:19:24.573518 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"253414df0b83c02260ceae77d62dbdd5110b2c2b1345f5e00f05edfd31e4b1ae"} Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.587784 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" event={"ID":"c5d25893-8bce-46da-9806-2fde750d93d0","Type":"ContainerStarted","Data":"23bb1d1d3c3549c8bf7d4873d7650e19106674fdc6ed047085fd8ff8a4315b00"} Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.588752 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.588845 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.588941 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.619315 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.620706 4745 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:19:26 crc kubenswrapper[4745]: I0319 00:19:26.629740 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" podStartSLOduration=7.629716723 podStartE2EDuration="7.629716723s" podCreationTimestamp="2026-03-19 00:19:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:19:26.62506819 +0000 UTC m=+731.163263331" watchObservedRunningTime="2026-03-19 00:19:26.629716723 +0000 UTC m=+731.167911854" Mar 19 00:19:31 crc kubenswrapper[4745]: I0319 00:19:31.138449 4745 scope.go:117] "RemoveContainer" containerID="24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8" Mar 19 00:19:31 crc kubenswrapper[4745]: E0319 00:19:31.139155 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mlwp7_openshift-multus(6a0ae9c0-f19a-4038-be03-0fa6d223ebbf)\"" pod="openshift-multus/multus-mlwp7" podUID="6a0ae9c0-f19a-4038-be03-0fa6d223ebbf" Mar 19 00:19:44 crc kubenswrapper[4745]: I0319 00:19:44.138616 4745 scope.go:117] "RemoveContainer" containerID="24824f54f5d7906d7ff9e415522e7a824bf14a0a7fddbb60e7b205d77b6a0be8" Mar 19 00:19:44 crc kubenswrapper[4745]: I0319 00:19:44.705938 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mlwp7_6a0ae9c0-f19a-4038-be03-0fa6d223ebbf/kube-multus/2.log" Mar 19 00:19:44 crc kubenswrapper[4745]: I0319 00:19:44.706604 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mlwp7" event={"ID":"6a0ae9c0-f19a-4038-be03-0fa6d223ebbf","Type":"ContainerStarted","Data":"3e39454c14fb4170d640e610a3b7a97f6bb1459a50ea1fd4e65e6ac76d961ee5"} Mar 19 00:19:49 crc kubenswrapper[4745]: I0319 
00:19:49.877699 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-64vfm" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.134114 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564660-t5gfq"] Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.135833 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.138333 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.138487 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.145667 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.148058 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564660-t5gfq"] Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.225196 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npkzr\" (UniqueName: \"kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr\") pod \"auto-csr-approver-29564660-t5gfq\" (UID: \"559d4ca4-399c-4504-8358-69d88bfdaf3a\") " pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.325769 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npkzr\" (UniqueName: \"kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr\") pod \"auto-csr-approver-29564660-t5gfq\" (UID: 
\"559d4ca4-399c-4504-8358-69d88bfdaf3a\") " pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.348366 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npkzr\" (UniqueName: \"kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr\") pod \"auto-csr-approver-29564660-t5gfq\" (UID: \"559d4ca4-399c-4504-8358-69d88bfdaf3a\") " pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.458055 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.684918 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564660-t5gfq"] Mar 19 00:20:00 crc kubenswrapper[4745]: I0319 00:20:00.803249 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" event={"ID":"559d4ca4-399c-4504-8358-69d88bfdaf3a","Type":"ContainerStarted","Data":"e1365df5e61031fa7137d4e234b2937f4b37148d8b6ecce9356c1b4bd9b25ca6"} Mar 19 00:20:02 crc kubenswrapper[4745]: I0319 00:20:02.816053 4745 generic.go:334] "Generic (PLEG): container finished" podID="559d4ca4-399c-4504-8358-69d88bfdaf3a" containerID="1663b5c8bcd4ae3a664653728fe6c21020e126b8db8f2cf94f1cfba9c6c7bbc2" exitCode=0 Mar 19 00:20:02 crc kubenswrapper[4745]: I0319 00:20:02.816132 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" event={"ID":"559d4ca4-399c-4504-8358-69d88bfdaf3a","Type":"ContainerDied","Data":"1663b5c8bcd4ae3a664653728fe6c21020e126b8db8f2cf94f1cfba9c6c7bbc2"} Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.053515 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.177611 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npkzr\" (UniqueName: \"kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr\") pod \"559d4ca4-399c-4504-8358-69d88bfdaf3a\" (UID: \"559d4ca4-399c-4504-8358-69d88bfdaf3a\") " Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.183677 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr" (OuterVolumeSpecName: "kube-api-access-npkzr") pod "559d4ca4-399c-4504-8358-69d88bfdaf3a" (UID: "559d4ca4-399c-4504-8358-69d88bfdaf3a"). InnerVolumeSpecName "kube-api-access-npkzr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.279066 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npkzr\" (UniqueName: \"kubernetes.io/projected/559d4ca4-399c-4504-8358-69d88bfdaf3a-kube-api-access-npkzr\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.829501 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" event={"ID":"559d4ca4-399c-4504-8358-69d88bfdaf3a","Type":"ContainerDied","Data":"e1365df5e61031fa7137d4e234b2937f4b37148d8b6ecce9356c1b4bd9b25ca6"} Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.829568 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1365df5e61031fa7137d4e234b2937f4b37148d8b6ecce9356c1b4bd9b25ca6" Mar 19 00:20:04 crc kubenswrapper[4745]: I0319 00:20:04.829570 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564660-t5gfq" Mar 19 00:20:05 crc kubenswrapper[4745]: I0319 00:20:05.117580 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564654-j2b7b"] Mar 19 00:20:05 crc kubenswrapper[4745]: I0319 00:20:05.123207 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564654-j2b7b"] Mar 19 00:20:06 crc kubenswrapper[4745]: I0319 00:20:06.145610 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7689b2b-3fcb-4122-bb50-fb8215cdb08b" path="/var/lib/kubelet/pods/f7689b2b-3fcb-4122-bb50-fb8215cdb08b/volumes" Mar 19 00:20:15 crc kubenswrapper[4745]: I0319 00:20:15.606730 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:20:15 crc kubenswrapper[4745]: I0319 00:20:15.608732 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.411840 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"] Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.412948 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j587v" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="registry-server" containerID="cri-o://60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a" gracePeriod=30 Mar 19 00:20:24 crc 
kubenswrapper[4745]: I0319 00:20:24.771129 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.891679 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities\") pod \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.892580 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content\") pod \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.892780 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmdx8\" (UniqueName: \"kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8\") pod \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\" (UID: \"0dcd96ee-b500-4027-8a29-f0d6f59ea06b\") " Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.892788 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities" (OuterVolumeSpecName: "utilities") pod "0dcd96ee-b500-4027-8a29-f0d6f59ea06b" (UID: "0dcd96ee-b500-4027-8a29-f0d6f59ea06b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.902249 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8" (OuterVolumeSpecName: "kube-api-access-fmdx8") pod "0dcd96ee-b500-4027-8a29-f0d6f59ea06b" (UID: "0dcd96ee-b500-4027-8a29-f0d6f59ea06b"). InnerVolumeSpecName "kube-api-access-fmdx8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.922741 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0dcd96ee-b500-4027-8a29-f0d6f59ea06b" (UID: "0dcd96ee-b500-4027-8a29-f0d6f59ea06b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.978800 4745 generic.go:334] "Generic (PLEG): container finished" podID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerID="60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a" exitCode=0 Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.979136 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j587v" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.978896 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerDied","Data":"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a"} Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.979439 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j587v" event={"ID":"0dcd96ee-b500-4027-8a29-f0d6f59ea06b","Type":"ContainerDied","Data":"5e90ccfe8369ff143297fb00f49862c706dc2eb6a69c3dc1f5670ef331a15a02"} Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.979558 4745 scope.go:117] "RemoveContainer" containerID="60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.993873 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.994170 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmdx8\" (UniqueName: \"kubernetes.io/projected/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-kube-api-access-fmdx8\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:24 crc kubenswrapper[4745]: I0319 00:20:24.994259 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0dcd96ee-b500-4027-8a29-f0d6f59ea06b-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.003454 4745 scope.go:117] "RemoveContainer" containerID="a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.031812 4745 scope.go:117] "RemoveContainer" 
containerID="d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.034212 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"] Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.040296 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j587v"] Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.062004 4745 scope.go:117] "RemoveContainer" containerID="60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a" Mar 19 00:20:25 crc kubenswrapper[4745]: E0319 00:20:25.062749 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a\": container with ID starting with 60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a not found: ID does not exist" containerID="60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.062861 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a"} err="failed to get container status \"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a\": rpc error: code = NotFound desc = could not find container \"60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a\": container with ID starting with 60cbc3a88c1cf03f9c03421ef3083627231649ea2779d0d325ffa79d2c07128a not found: ID does not exist" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.062957 4745 scope.go:117] "RemoveContainer" containerID="a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c" Mar 19 00:20:25 crc kubenswrapper[4745]: E0319 00:20:25.063524 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound 
desc = could not find container \"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c\": container with ID starting with a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c not found: ID does not exist" containerID="a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.063633 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c"} err="failed to get container status \"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c\": rpc error: code = NotFound desc = could not find container \"a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c\": container with ID starting with a3aebdc0bc2f3f37d96f1b34b6711d65e38c74a57f674efb68f8a10a4b94af8c not found: ID does not exist" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.063732 4745 scope.go:117] "RemoveContainer" containerID="d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae" Mar 19 00:20:25 crc kubenswrapper[4745]: E0319 00:20:25.064232 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae\": container with ID starting with d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae not found: ID does not exist" containerID="d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae" Mar 19 00:20:25 crc kubenswrapper[4745]: I0319 00:20:25.064327 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae"} err="failed to get container status \"d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae\": rpc error: code = NotFound desc = could not find container 
\"d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae\": container with ID starting with d1d5b5c9823a65f928be8bfacdf649608867bb26e8a829cbd3fb5f8baa7cd5ae not found: ID does not exist" Mar 19 00:20:26 crc kubenswrapper[4745]: I0319 00:20:26.145168 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" path="/var/lib/kubelet/pods/0dcd96ee-b500-4027-8a29-f0d6f59ea06b/volumes" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.506314 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d"] Mar 19 00:20:28 crc kubenswrapper[4745]: E0319 00:20:28.506991 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="registry-server" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507006 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="registry-server" Mar 19 00:20:28 crc kubenswrapper[4745]: E0319 00:20:28.507021 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="extract-utilities" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507028 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="extract-utilities" Mar 19 00:20:28 crc kubenswrapper[4745]: E0319 00:20:28.507038 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="559d4ca4-399c-4504-8358-69d88bfdaf3a" containerName="oc" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507047 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="559d4ca4-399c-4504-8358-69d88bfdaf3a" containerName="oc" Mar 19 00:20:28 crc kubenswrapper[4745]: E0319 00:20:28.507070 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" 
containerName="extract-content" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507075 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="extract-content" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507173 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="559d4ca4-399c-4504-8358-69d88bfdaf3a" containerName="oc" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507182 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dcd96ee-b500-4027-8a29-f0d6f59ea06b" containerName="registry-server" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.507952 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.511684 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.520465 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d"] Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.547703 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.547761 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util\") pod 
\"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.547869 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxct9\" (UniqueName: \"kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.651454 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.651527 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.651614 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxct9\" (UniqueName: \"kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: 
\"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.652215 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.652615 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.679353 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxct9\" (UniqueName: \"kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:28 crc kubenswrapper[4745]: I0319 00:20:28.861027 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:29 crc kubenswrapper[4745]: I0319 00:20:29.064584 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d"] Mar 19 00:20:30 crc kubenswrapper[4745]: I0319 00:20:30.014195 4745 generic.go:334] "Generic (PLEG): container finished" podID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerID="0465dcfd21fde34a50f161d1cde6579abf629fb152d29ad2a3fdb5dacb7957d2" exitCode=0 Mar 19 00:20:30 crc kubenswrapper[4745]: I0319 00:20:30.014271 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" event={"ID":"6e2c290f-c398-4c6e-9dec-82038e0bda08","Type":"ContainerDied","Data":"0465dcfd21fde34a50f161d1cde6579abf629fb152d29ad2a3fdb5dacb7957d2"} Mar 19 00:20:30 crc kubenswrapper[4745]: I0319 00:20:30.014705 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" event={"ID":"6e2c290f-c398-4c6e-9dec-82038e0bda08","Type":"ContainerStarted","Data":"2223840d46c198dcf1ff6d4a9faf641dfd5931970f8a283d0f4aab34a9da5a6a"} Mar 19 00:20:32 crc kubenswrapper[4745]: I0319 00:20:32.033112 4745 generic.go:334] "Generic (PLEG): container finished" podID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerID="ef1ea707f8bf9df1e84b40963515384be179344be9dd602a8d23878a7e5524b6" exitCode=0 Mar 19 00:20:32 crc kubenswrapper[4745]: I0319 00:20:32.033216 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" event={"ID":"6e2c290f-c398-4c6e-9dec-82038e0bda08","Type":"ContainerDied","Data":"ef1ea707f8bf9df1e84b40963515384be179344be9dd602a8d23878a7e5524b6"} Mar 19 00:20:33 crc kubenswrapper[4745]: I0319 00:20:33.042315 4745 
generic.go:334] "Generic (PLEG): container finished" podID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerID="f5380b52bacf8a1042f9633d7a52226818469c07b6fd057be0b678d8b9909f81" exitCode=0 Mar 19 00:20:33 crc kubenswrapper[4745]: I0319 00:20:33.042374 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" event={"ID":"6e2c290f-c398-4c6e-9dec-82038e0bda08","Type":"ContainerDied","Data":"f5380b52bacf8a1042f9633d7a52226818469c07b6fd057be0b678d8b9909f81"} Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.304561 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892"] Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.306382 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.319968 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892"] Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.340832 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.340955 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" 
(UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.340999 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fvzm\" (UniqueName: \"kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.397142 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.441921 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util\") pod \"6e2c290f-c398-4c6e-9dec-82038e0bda08\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.442046 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxct9\" (UniqueName: \"kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9\") pod \"6e2c290f-c398-4c6e-9dec-82038e0bda08\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.442167 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle\") pod \"6e2c290f-c398-4c6e-9dec-82038e0bda08\" (UID: \"6e2c290f-c398-4c6e-9dec-82038e0bda08\") " Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.442413 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.442464 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fvzm\" (UniqueName: \"kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.442498 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.443257 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.443248 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util\") 
pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.446009 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle" (OuterVolumeSpecName: "bundle") pod "6e2c290f-c398-4c6e-9dec-82038e0bda08" (UID: "6e2c290f-c398-4c6e-9dec-82038e0bda08"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.450601 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9" (OuterVolumeSpecName: "kube-api-access-cxct9") pod "6e2c290f-c398-4c6e-9dec-82038e0bda08" (UID: "6e2c290f-c398-4c6e-9dec-82038e0bda08"). InnerVolumeSpecName "kube-api-access-cxct9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.466007 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fvzm\" (UniqueName: \"kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm\") pod \"7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.469982 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util" (OuterVolumeSpecName: "util") pod "6e2c290f-c398-4c6e-9dec-82038e0bda08" (UID: "6e2c290f-c398-4c6e-9dec-82038e0bda08"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.543780 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-util\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.543827 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cxct9\" (UniqueName: \"kubernetes.io/projected/6e2c290f-c398-4c6e-9dec-82038e0bda08-kube-api-access-cxct9\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.543839 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6e2c290f-c398-4c6e-9dec-82038e0bda08-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.632252 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.760194 4745 scope.go:117] "RemoveContainer" containerID="d0b7bf2fb29c7b89c86195effcef47a72d6e88d2457a53a6804ca521616f6ee6" Mar 19 00:20:34 crc kubenswrapper[4745]: I0319 00:20:34.858296 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892"] Mar 19 00:20:34 crc kubenswrapper[4745]: W0319 00:20:34.867811 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8baea7f9_4007_48cc_a849_7b8ce10c526b.slice/crio-c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90 WatchSource:0}: Error finding container c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90: Status 404 returned error can't find the container with id c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90 
Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.058354 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerStarted","Data":"c1a0807a6be939d584d1b6f01c689a15037cfa9127acd29591944dcc243a8195"} Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.058684 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerStarted","Data":"c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90"} Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.061809 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" event={"ID":"6e2c290f-c398-4c6e-9dec-82038e0bda08","Type":"ContainerDied","Data":"2223840d46c198dcf1ff6d4a9faf641dfd5931970f8a283d0f4aab34a9da5a6a"} Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.061837 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2223840d46c198dcf1ff6d4a9faf641dfd5931970f8a283d0f4aab34a9da5a6a" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.061947 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.313584 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2"] Mar 19 00:20:35 crc kubenswrapper[4745]: E0319 00:20:35.313963 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="util" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.313983 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="util" Mar 19 00:20:35 crc kubenswrapper[4745]: E0319 00:20:35.314009 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="extract" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.314016 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="extract" Mar 19 00:20:35 crc kubenswrapper[4745]: E0319 00:20:35.314031 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="pull" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.314040 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="pull" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.314136 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e2c290f-c398-4c6e-9dec-82038e0bda08" containerName="extract" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.319850 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.326001 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2"] Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.355070 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfxlz\" (UniqueName: \"kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.355207 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.355243 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.456517 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.456590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.456616 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfxlz\" (UniqueName: \"kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.457437 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.457624 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: 
\"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.480028 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfxlz\" (UniqueName: \"kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.635422 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:35 crc kubenswrapper[4745]: I0319 00:20:35.887963 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2"] Mar 19 00:20:35 crc kubenswrapper[4745]: W0319 00:20:35.896033 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ef9c16b_de5e_456d_899c_15bdcfba6c89.slice/crio-d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113 WatchSource:0}: Error finding container d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113: Status 404 returned error can't find the container with id d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113 Mar 19 00:20:36 crc kubenswrapper[4745]: I0319 00:20:36.068801 4745 generic.go:334] "Generic (PLEG): container finished" podID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerID="c1a0807a6be939d584d1b6f01c689a15037cfa9127acd29591944dcc243a8195" exitCode=0 Mar 19 00:20:36 crc kubenswrapper[4745]: I0319 00:20:36.068937 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerDied","Data":"c1a0807a6be939d584d1b6f01c689a15037cfa9127acd29591944dcc243a8195"} Mar 19 00:20:36 crc kubenswrapper[4745]: I0319 00:20:36.071698 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerStarted","Data":"87bb193744374a4892e61e2db169e197782d1eba0014818900e955f51aefbdfe"} Mar 19 00:20:36 crc kubenswrapper[4745]: I0319 00:20:36.071749 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerStarted","Data":"d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113"} Mar 19 00:20:37 crc kubenswrapper[4745]: I0319 00:20:37.081176 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerID="87bb193744374a4892e61e2db169e197782d1eba0014818900e955f51aefbdfe" exitCode=0 Mar 19 00:20:37 crc kubenswrapper[4745]: I0319 00:20:37.081298 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerDied","Data":"87bb193744374a4892e61e2db169e197782d1eba0014818900e955f51aefbdfe"} Mar 19 00:20:37 crc kubenswrapper[4745]: I0319 00:20:37.085264 4745 generic.go:334] "Generic (PLEG): container finished" podID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerID="74c1fd116e1be56ae0ca1d679824c6061d578cdad159dd7bdee7cf2704e0e3f4" exitCode=0 Mar 19 00:20:37 crc kubenswrapper[4745]: I0319 00:20:37.085298 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerDied","Data":"74c1fd116e1be56ae0ca1d679824c6061d578cdad159dd7bdee7cf2704e0e3f4"} Mar 19 00:20:38 crc kubenswrapper[4745]: I0319 00:20:38.093707 4745 generic.go:334] "Generic (PLEG): container finished" podID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerID="6eb0b771b13168857799256a37aec560362ff2f87f67686d96e809d87ee9de8a" exitCode=0 Mar 19 00:20:38 crc kubenswrapper[4745]: I0319 00:20:38.093826 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerDied","Data":"6eb0b771b13168857799256a37aec560362ff2f87f67686d96e809d87ee9de8a"} Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.511745 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.633415 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util\") pod \"8baea7f9-4007-48cc-a849-7b8ce10c526b\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.633555 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle\") pod \"8baea7f9-4007-48cc-a849-7b8ce10c526b\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.633585 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fvzm\" (UniqueName: 
\"kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm\") pod \"8baea7f9-4007-48cc-a849-7b8ce10c526b\" (UID: \"8baea7f9-4007-48cc-a849-7b8ce10c526b\") " Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.634508 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle" (OuterVolumeSpecName: "bundle") pod "8baea7f9-4007-48cc-a849-7b8ce10c526b" (UID: "8baea7f9-4007-48cc-a849-7b8ce10c526b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.640858 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm" (OuterVolumeSpecName: "kube-api-access-5fvzm") pod "8baea7f9-4007-48cc-a849-7b8ce10c526b" (UID: "8baea7f9-4007-48cc-a849-7b8ce10c526b"). InnerVolumeSpecName "kube-api-access-5fvzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.653826 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util" (OuterVolumeSpecName: "util") pod "8baea7f9-4007-48cc-a849-7b8ce10c526b" (UID: "8baea7f9-4007-48cc-a849-7b8ce10c526b"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.735871 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.735933 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fvzm\" (UniqueName: \"kubernetes.io/projected/8baea7f9-4007-48cc-a849-7b8ce10c526b-kube-api-access-5fvzm\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:39 crc kubenswrapper[4745]: I0319 00:20:39.735947 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8baea7f9-4007-48cc-a849-7b8ce10c526b-util\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:40 crc kubenswrapper[4745]: I0319 00:20:40.205640 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" event={"ID":"8baea7f9-4007-48cc-a849-7b8ce10c526b","Type":"ContainerDied","Data":"c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90"} Mar 19 00:20:40 crc kubenswrapper[4745]: I0319 00:20:40.205704 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2cdb0635f7733f7f9b82cfbee3210120b13039355d2311046b243f6b900dc90" Mar 19 00:20:40 crc kubenswrapper[4745]: I0319 00:20:40.205695 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892" Mar 19 00:20:40 crc kubenswrapper[4745]: I0319 00:20:40.207841 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerID="320635c6f3f06d3534da7620d1c32e51b07e2f3cfb8d016e53c4d638a8ff1591" exitCode=0 Mar 19 00:20:40 crc kubenswrapper[4745]: I0319 00:20:40.207915 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerDied","Data":"320635c6f3f06d3534da7620d1c32e51b07e2f3cfb8d016e53c4d638a8ff1591"} Mar 19 00:20:41 crc kubenswrapper[4745]: I0319 00:20:41.216431 4745 generic.go:334] "Generic (PLEG): container finished" podID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerID="cc6d2564161abb289a0cdfc21878f1353ec0ea6ab461cff0ac11a657e6894e3e" exitCode=0 Mar 19 00:20:41 crc kubenswrapper[4745]: I0319 00:20:41.216530 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerDied","Data":"cc6d2564161abb289a0cdfc21878f1353ec0ea6ab461cff0ac11a657e6894e3e"} Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.063491 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.091641 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l"] Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.091911 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="util" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.091925 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="util" Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.091937 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.091944 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.091958 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="pull" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.091966 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="pull" Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.091976 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.091982 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.091993 4745 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="pull" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.092000 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="pull" Mar 19 00:20:43 crc kubenswrapper[4745]: E0319 00:20:43.092014 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="util" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.092021 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="util" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.092107 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baea7f9-4007-48cc-a849-7b8ce10c526b" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.092117 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ef9c16b-de5e-456d-899c-15bdcfba6c89" containerName="extract" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.092963 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.109924 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l"] Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.253617 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle\") pod \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.253762 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfxlz\" (UniqueName: \"kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz\") pod \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.253821 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util\") pod \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\" (UID: \"9ef9c16b-de5e-456d-899c-15bdcfba6c89\") " Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.254078 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4z5l\" (UniqueName: \"kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.254553 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.254666 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.255598 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle" (OuterVolumeSpecName: "bundle") pod "9ef9c16b-de5e-456d-899c-15bdcfba6c89" (UID: "9ef9c16b-de5e-456d-899c-15bdcfba6c89"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.272263 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util" (OuterVolumeSpecName: "util") pod "9ef9c16b-de5e-456d-899c-15bdcfba6c89" (UID: "9ef9c16b-de5e-456d-899c-15bdcfba6c89"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.272507 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz" (OuterVolumeSpecName: "kube-api-access-jfxlz") pod "9ef9c16b-de5e-456d-899c-15bdcfba6c89" (UID: "9ef9c16b-de5e-456d-899c-15bdcfba6c89"). InnerVolumeSpecName "kube-api-access-jfxlz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.278825 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" event={"ID":"9ef9c16b-de5e-456d-899c-15bdcfba6c89","Type":"ContainerDied","Data":"d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113"} Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.278872 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d2395403789e88ce0a8f9d4fe105492924c1349886630b850d4425403bb46113" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.278968 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356079 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356135 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356189 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4z5l\" (UniqueName: \"kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356298 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356311 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfxlz\" (UniqueName: 
\"kubernetes.io/projected/9ef9c16b-de5e-456d-899c-15bdcfba6c89-kube-api-access-jfxlz\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356321 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9ef9c16b-de5e-456d-899c-15bdcfba6c89-util\") on node \"crc\" DevicePath \"\"" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.356934 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.357002 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.396367 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4z5l\" (UniqueName: \"kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:43 crc kubenswrapper[4745]: I0319 00:20:43.407941 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:20:45 crc kubenswrapper[4745]: I0319 00:20:45.105859 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l"] Mar 19 00:20:45 crc kubenswrapper[4745]: I0319 00:20:45.301700 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerStarted","Data":"3ef7a137321f30be497e8cc61b9023655954715f64b2bf473455bf5c69e5a016"} Mar 19 00:20:45 crc kubenswrapper[4745]: I0319 00:20:45.606169 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:20:45 crc kubenswrapper[4745]: I0319 00:20:45.606903 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:20:46 crc kubenswrapper[4745]: I0319 00:20:46.308863 4745 generic.go:334] "Generic (PLEG): container finished" podID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerID="187af0133ef427170b8c6022e9a4d4d260d309d247b481838c77af900f2c83b6" exitCode=0 Mar 19 00:20:46 crc kubenswrapper[4745]: I0319 00:20:46.308919 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" 
event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerDied","Data":"187af0133ef427170b8c6022e9a4d4d260d309d247b481838c77af900f2c83b6"} Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.751173 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm"] Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.752230 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.757649 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.757548 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.759434 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-zppsp" Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.828092 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm"] Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.897232 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6zt6\" (UniqueName: \"kubernetes.io/projected/bcf530c8-afe8-4a0e-9e5c-bfd85712e37a-kube-api-access-c6zt6\") pod \"obo-prometheus-operator-8ff7d675-tz2rm\" (UID: \"bcf530c8-afe8-4a0e-9e5c-bfd85712e37a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" Mar 19 00:20:47 crc kubenswrapper[4745]: I0319 00:20:47.998498 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6zt6\" (UniqueName: \"kubernetes.io/projected/bcf530c8-afe8-4a0e-9e5c-bfd85712e37a-kube-api-access-c6zt6\") pod 
\"obo-prometheus-operator-8ff7d675-tz2rm\" (UID: \"bcf530c8-afe8-4a0e-9e5c-bfd85712e37a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.035664 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6zt6\" (UniqueName: \"kubernetes.io/projected/bcf530c8-afe8-4a0e-9e5c-bfd85712e37a-kube-api-access-c6zt6\") pod \"obo-prometheus-operator-8ff7d675-tz2rm\" (UID: \"bcf530c8-afe8-4a0e-9e5c-bfd85712e37a\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.068712 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.368926 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.376242 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.384912 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-fbxzm" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.385825 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.396963 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.398755 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.399862 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.461580 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.508303 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.508393 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.508452 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.508474 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: 
\"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.580135 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm"] Mar 19 00:20:48 crc kubenswrapper[4745]: W0319 00:20:48.599232 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcf530c8_afe8_4a0e_9e5c_bfd85712e37a.slice/crio-39c5408e3451690ad2f74d4ac71ce5fc5253d0be296d6c14aacd956916b99bd1 WatchSource:0}: Error finding container 39c5408e3451690ad2f74d4ac71ce5fc5253d0be296d6c14aacd956916b99bd1: Status 404 returned error can't find the container with id 39c5408e3451690ad2f74d4ac71ce5fc5253d0be296d6c14aacd956916b99bd1 Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.614998 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.615060 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.615101 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.615146 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.623272 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.628270 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.628859 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/69d14850-5c50-4c06-8581-2a70644c7de7-webhook-cert\") pod 
\"obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78\" (UID: \"69d14850-5c50-4c06-8581-2a70644c7de7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.630521 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/f5c4fe84-51cb-479a-a8cc-2e07bde21417-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2\" (UID: \"f5c4fe84-51cb-479a-a8cc-2e07bde21417\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.718825 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.726236 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.792537 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-shlz7"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.793374 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.795806 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-tdc6z" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.798348 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.852277 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-shlz7"] Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.931813 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsg5k\" (UniqueName: \"kubernetes.io/projected/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-kube-api-access-fsg5k\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:48 crc kubenswrapper[4745]: I0319 00:20:48.932388 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.034173 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsg5k\" (UniqueName: \"kubernetes.io/projected/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-kube-api-access-fsg5k\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " 
pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.034539 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.040041 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.073764 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsg5k\" (UniqueName: \"kubernetes.io/projected/a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534-kube-api-access-fsg5k\") pod \"observability-operator-6dd7dd855f-shlz7\" (UID: \"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534\") " pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.119550 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.375299 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" event={"ID":"bcf530c8-afe8-4a0e-9e5c-bfd85712e37a","Type":"ContainerStarted","Data":"39c5408e3451690ad2f74d4ac71ce5fc5253d0be296d6c14aacd956916b99bd1"} Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.440720 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-78548ff687-rjvkn"] Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.449836 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.454785 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.455148 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-zf5z7" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.489578 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-78548ff687-rjvkn"] Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.504319 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78"] Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.551291 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-apiservice-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc 
kubenswrapper[4745]: I0319 00:20:49.551350 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfgn6\" (UniqueName: \"kubernetes.io/projected/58d05d23-3632-4b84-94f8-1db548b90a03-kube-api-access-bfgn6\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.551384 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/58d05d23-3632-4b84-94f8-1db548b90a03-openshift-service-ca\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.551425 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-webhook-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.635361 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2"] Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.654629 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/58d05d23-3632-4b84-94f8-1db548b90a03-openshift-service-ca\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.654718 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-webhook-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.654767 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-apiservice-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.654789 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfgn6\" (UniqueName: \"kubernetes.io/projected/58d05d23-3632-4b84-94f8-1db548b90a03-kube-api-access-bfgn6\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.656831 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/58d05d23-3632-4b84-94f8-1db548b90a03-openshift-service-ca\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.670685 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-webhook-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: W0319 00:20:49.672318 
4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c4fe84_51cb_479a_a8cc_2e07bde21417.slice/crio-7cb52939b4a9f8de254a9ba4126ceeedf95c42bf8524807229ce7b9648861386 WatchSource:0}: Error finding container 7cb52939b4a9f8de254a9ba4126ceeedf95c42bf8524807229ce7b9648861386: Status 404 returned error can't find the container with id 7cb52939b4a9f8de254a9ba4126ceeedf95c42bf8524807229ce7b9648861386 Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.683720 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/58d05d23-3632-4b84-94f8-1db548b90a03-apiservice-cert\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.684478 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfgn6\" (UniqueName: \"kubernetes.io/projected/58d05d23-3632-4b84-94f8-1db548b90a03-kube-api-access-bfgn6\") pod \"perses-operator-78548ff687-rjvkn\" (UID: \"58d05d23-3632-4b84-94f8-1db548b90a03\") " pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.786408 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:20:49 crc kubenswrapper[4745]: I0319 00:20:49.854739 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-shlz7"] Mar 19 00:20:49 crc kubenswrapper[4745]: W0319 00:20:49.882861 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7c2afa4_a0d4_4aad_b6ad_31b7cb4c9534.slice/crio-c59adb8ac4c17681a5c895b4f642b81a9b8961395ec6d146a3f313f2e26400bb WatchSource:0}: Error finding container c59adb8ac4c17681a5c895b4f642b81a9b8961395ec6d146a3f313f2e26400bb: Status 404 returned error can't find the container with id c59adb8ac4c17681a5c895b4f642b81a9b8961395ec6d146a3f313f2e26400bb Mar 19 00:20:50 crc kubenswrapper[4745]: I0319 00:20:50.104981 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-78548ff687-rjvkn"] Mar 19 00:20:50 crc kubenswrapper[4745]: W0319 00:20:50.139312 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58d05d23_3632_4b84_94f8_1db548b90a03.slice/crio-03edfc4bbcfc08847196ab41d37cd2e71029e4e8306f3387489c0d0b5e64e6dc WatchSource:0}: Error finding container 03edfc4bbcfc08847196ab41d37cd2e71029e4e8306f3387489c0d0b5e64e6dc: Status 404 returned error can't find the container with id 03edfc4bbcfc08847196ab41d37cd2e71029e4e8306f3387489c0d0b5e64e6dc Mar 19 00:20:50 crc kubenswrapper[4745]: I0319 00:20:50.403495 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" event={"ID":"f5c4fe84-51cb-479a-a8cc-2e07bde21417","Type":"ContainerStarted","Data":"7cb52939b4a9f8de254a9ba4126ceeedf95c42bf8524807229ce7b9648861386"} Mar 19 00:20:50 crc kubenswrapper[4745]: I0319 00:20:50.404961 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" event={"ID":"69d14850-5c50-4c06-8581-2a70644c7de7","Type":"ContainerStarted","Data":"1ae45c16e4eb0987ccf1106b6425fc706c42ab5acb437b54bf7fa9c7f187c3ab"} Mar 19 00:20:50 crc kubenswrapper[4745]: I0319 00:20:50.406556 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" event={"ID":"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534","Type":"ContainerStarted","Data":"c59adb8ac4c17681a5c895b4f642b81a9b8961395ec6d146a3f313f2e26400bb"} Mar 19 00:20:50 crc kubenswrapper[4745]: I0319 00:20:50.408639 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-78548ff687-rjvkn" event={"ID":"58d05d23-3632-4b84-94f8-1db548b90a03","Type":"ContainerStarted","Data":"03edfc4bbcfc08847196ab41d37cd2e71029e4e8306f3387489c0d0b5e64e6dc"} Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.257838 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-f6stw"] Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.259352 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.262321 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-pwn48" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.262582 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.266352 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.363444 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpkxz\" (UniqueName: \"kubernetes.io/projected/9f239a73-6d47-4e9b-a74e-f97757ec8e4f-kube-api-access-wpkxz\") pod \"interconnect-operator-5bb49f789d-f6stw\" (UID: \"9f239a73-6d47-4e9b-a74e-f97757ec8e4f\") " pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.465086 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpkxz\" (UniqueName: \"kubernetes.io/projected/9f239a73-6d47-4e9b-a74e-f97757ec8e4f-kube-api-access-wpkxz\") pod \"interconnect-operator-5bb49f789d-f6stw\" (UID: \"9f239a73-6d47-4e9b-a74e-f97757ec8e4f\") " pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.492388 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpkxz\" (UniqueName: \"kubernetes.io/projected/9f239a73-6d47-4e9b-a74e-f97757ec8e4f-kube-api-access-wpkxz\") pod \"interconnect-operator-5bb49f789d-f6stw\" (UID: \"9f239a73-6d47-4e9b-a74e-f97757ec8e4f\") " pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.537444 
4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-f6stw"] Mar 19 00:20:53 crc kubenswrapper[4745]: I0319 00:20:53.586550 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.904039 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-64749dc85d-gzlt5"] Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.906712 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.910201 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.911857 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-qp99d" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.923218 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-64749dc85d-gzlt5"] Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.948046 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-webhook-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.948125 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6qtj\" (UniqueName: \"kubernetes.io/projected/5ba244a1-0d6a-4cab-84c4-51501f3c7916-kube-api-access-c6qtj\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: 
\"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:55 crc kubenswrapper[4745]: I0319 00:20:55.948217 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-apiservice-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.049198 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-apiservice-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.049273 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-webhook-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.049305 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6qtj\" (UniqueName: \"kubernetes.io/projected/5ba244a1-0d6a-4cab-84c4-51501f3c7916-kube-api-access-c6qtj\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.056455 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-apiservice-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.068741 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5ba244a1-0d6a-4cab-84c4-51501f3c7916-webhook-cert\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.096691 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6qtj\" (UniqueName: \"kubernetes.io/projected/5ba244a1-0d6a-4cab-84c4-51501f3c7916-kube-api-access-c6qtj\") pod \"elastic-operator-64749dc85d-gzlt5\" (UID: \"5ba244a1-0d6a-4cab-84c4-51501f3c7916\") " pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:56 crc kubenswrapper[4745]: I0319 00:20:56.227583 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" Mar 19 00:20:57 crc kubenswrapper[4745]: I0319 00:20:57.895928 4745 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 00:21:04 crc kubenswrapper[4745]: E0319 00:21:04.842976 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:161082f81c8c77471a421b3b4bcb8a47ca64aa08a5dd1abf27e7f2f964b35a2a" Mar 19 00:21:04 crc kubenswrapper[4745]: E0319 00:21:04.843690 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:161082f81c8c77471a421b3b4bcb8a47ca64aa08a5dd1abf27e7f2f964b35a2a,Command:[],Args:[--prometheus-config-reloader=$(RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER) --prometheus-instance-selector=app.kubernetes.io/managed-by=observability-operator --alertmanager-instance-selector=app.kubernetes.io/managed-by=observability-operator --thanos-ruler-instance-selector=app.kubernetes.io/managed-by=observability-operator --watch-referenced-objects-in-all-namespaces=true 
--disable-unmanaged-prometheus-configuration=true],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:GOGC,Value:30,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_PROMETHEUS_CONFIG_RELOADER,Value:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-prometheus-config-reloader-rhel9@sha256:e4412f5688c9725f36d2f566f624d82a1a2a5b957686245fd2defcc39604bdc2,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.4.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{157286400 0} {} 150Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c6zt6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-8ff7d675-tz2rm_openshift-operators(bcf530c8-afe8-4a0e-9e5c-bfd85712e37a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:21:04 crc kubenswrapper[4745]: E0319 00:21:04.844866 4745 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" podUID="bcf530c8-afe8-4a0e-9e5c-bfd85712e37a" Mar 19 00:21:04 crc kubenswrapper[4745]: E0319 00:21:04.966027 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-rhel9-operator@sha256:161082f81c8c77471a421b3b4bcb8a47ca64aa08a5dd1abf27e7f2f964b35a2a\\\"\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" podUID="bcf530c8-afe8-4a0e-9e5c-bfd85712e37a" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.317174 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.317417 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt 
--web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.4.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2_openshift-operators(f5c4fe84-51cb-479a-a8cc-2e07bde21417): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.318578 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" podUID="f5c4fe84-51cb-479a-a8cc-2e07bde21417" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.362840 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.363072 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.4.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78_openshift-operators(69d14850-5c50-4c06-8581-2a70644c7de7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.364202 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" podUID="69d14850-5c50-4c06-8581-2a70644c7de7" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.971857 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: 
\"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" podUID="f5c4fe84-51cb-479a-a8cc-2e07bde21417" Mar 19 00:21:05 crc kubenswrapper[4745]: E0319 00:21:05.972053 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:fb1030480e5a55ead0d9748615a2e4b9228522f14b77a782f44407883c24ba93\\\"\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" podUID="69d14850-5c50-4c06-8581-2a70644c7de7" Mar 19 00:21:09 crc kubenswrapper[4745]: I0319 00:21:09.645137 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-64749dc85d-gzlt5"] Mar 19 00:21:09 crc kubenswrapper[4745]: W0319 00:21:09.662100 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ba244a1_0d6a_4cab_84c4_51501f3c7916.slice/crio-1a016190fd3fe28d0c775889b96ccf640bffe43c012b7e28c066fb90024e3630 WatchSource:0}: Error finding container 1a016190fd3fe28d0c775889b96ccf640bffe43c012b7e28c066fb90024e3630: Status 404 returned error can't find the container with id 1a016190fd3fe28d0c775889b96ccf640bffe43c012b7e28c066fb90024e3630 Mar 19 00:21:09 crc kubenswrapper[4745]: I0319 00:21:09.664781 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 00:21:09 crc kubenswrapper[4745]: I0319 00:21:09.680192 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-f6stw"] Mar 19 00:21:09 crc kubenswrapper[4745]: 
W0319 00:21:09.692579 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f239a73_6d47_4e9b_a74e_f97757ec8e4f.slice/crio-31da37debf7775a6ca4da7f0df9ec13007ee1ad1534271277c01dca91211ffdf WatchSource:0}: Error finding container 31da37debf7775a6ca4da7f0df9ec13007ee1ad1534271277c01dca91211ffdf: Status 404 returned error can't find the container with id 31da37debf7775a6ca4da7f0df9ec13007ee1ad1534271277c01dca91211ffdf Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.083912 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" event={"ID":"a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534","Type":"ContainerStarted","Data":"a40e618f9855ef6ca3e897be6a0a37a8d7e52ce962bccc20fcf303fb42ceddcb"} Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.085535 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.087401 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.088559 4745 generic.go:334] "Generic (PLEG): container finished" podID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerID="0a0bc5818b6461e81b5a0781f254c9da8578165e3c8d81b051eb92f1dadd585e" exitCode=0 Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.088640 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerDied","Data":"0a0bc5818b6461e81b5a0781f254c9da8578165e3c8d81b051eb92f1dadd585e"} Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.092334 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/perses-operator-78548ff687-rjvkn" event={"ID":"58d05d23-3632-4b84-94f8-1db548b90a03","Type":"ContainerStarted","Data":"be55ef8f11a080e93ad68459f15d472dd4f7bd48a72237eaeb2ed8e11dd72157"} Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.092481 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.093788 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" event={"ID":"5ba244a1-0d6a-4cab-84c4-51501f3c7916","Type":"ContainerStarted","Data":"1a016190fd3fe28d0c775889b96ccf640bffe43c012b7e28c066fb90024e3630"} Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.094819 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" event={"ID":"9f239a73-6d47-4e9b-a74e-f97757ec8e4f","Type":"ContainerStarted","Data":"31da37debf7775a6ca4da7f0df9ec13007ee1ad1534271277c01dca91211ffdf"} Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.588094 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-shlz7" podStartSLOduration=3.6359071849999998 podStartE2EDuration="22.588072634s" podCreationTimestamp="2026-03-19 00:20:48 +0000 UTC" firstStartedPulling="2026-03-19 00:20:49.902964667 +0000 UTC m=+814.441159798" lastFinishedPulling="2026-03-19 00:21:08.855130116 +0000 UTC m=+833.393325247" observedRunningTime="2026-03-19 00:21:10.582922104 +0000 UTC m=+835.121117245" watchObservedRunningTime="2026-03-19 00:21:10.588072634 +0000 UTC m=+835.126267765" Mar 19 00:21:10 crc kubenswrapper[4745]: I0319 00:21:10.777331 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-78548ff687-rjvkn" podStartSLOduration=3.08036264 podStartE2EDuration="21.777308206s" 
podCreationTimestamp="2026-03-19 00:20:49 +0000 UTC" firstStartedPulling="2026-03-19 00:20:50.151789123 +0000 UTC m=+814.689984254" lastFinishedPulling="2026-03-19 00:21:08.848734689 +0000 UTC m=+833.386929820" observedRunningTime="2026-03-19 00:21:10.678428133 +0000 UTC m=+835.216623264" watchObservedRunningTime="2026-03-19 00:21:10.777308206 +0000 UTC m=+835.315503337" Mar 19 00:21:11 crc kubenswrapper[4745]: I0319 00:21:11.156562 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerStarted","Data":"7c68b3f23d960dc40d0683e2267b0f117a83d96c74d92e09b8cd76b3b2b28942"} Mar 19 00:21:11 crc kubenswrapper[4745]: I0319 00:21:11.216275 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" podStartSLOduration=5.748639025 podStartE2EDuration="28.216244664s" podCreationTimestamp="2026-03-19 00:20:43 +0000 UTC" firstStartedPulling="2026-03-19 00:20:46.310738304 +0000 UTC m=+810.848933435" lastFinishedPulling="2026-03-19 00:21:08.778343943 +0000 UTC m=+833.316539074" observedRunningTime="2026-03-19 00:21:11.213292303 +0000 UTC m=+835.751487464" watchObservedRunningTime="2026-03-19 00:21:11.216244664 +0000 UTC m=+835.754439795" Mar 19 00:21:12 crc kubenswrapper[4745]: I0319 00:21:12.167603 4745 generic.go:334] "Generic (PLEG): container finished" podID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerID="7c68b3f23d960dc40d0683e2267b0f117a83d96c74d92e09b8cd76b3b2b28942" exitCode=0 Mar 19 00:21:12 crc kubenswrapper[4745]: I0319 00:21:12.168577 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" 
event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerDied","Data":"7c68b3f23d960dc40d0683e2267b0f117a83d96c74d92e09b8cd76b3b2b28942"} Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.489090 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.505732 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle\") pod \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.505854 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4z5l\" (UniqueName: \"kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l\") pod \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.506046 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util\") pod \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\" (UID: \"4ef969f2-d76b-405e-baaf-c10a36d36ed3\") " Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.508325 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle" (OuterVolumeSpecName: "bundle") pod "4ef969f2-d76b-405e-baaf-c10a36d36ed3" (UID: "4ef969f2-d76b-405e-baaf-c10a36d36ed3"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.514211 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l" (OuterVolumeSpecName: "kube-api-access-m4z5l") pod "4ef969f2-d76b-405e-baaf-c10a36d36ed3" (UID: "4ef969f2-d76b-405e-baaf-c10a36d36ed3"). InnerVolumeSpecName "kube-api-access-m4z5l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.527974 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util" (OuterVolumeSpecName: "util") pod "4ef969f2-d76b-405e-baaf-c10a36d36ed3" (UID: "4ef969f2-d76b-405e-baaf-c10a36d36ed3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.607034 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-util\") on node \"crc\" DevicePath \"\"" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.607074 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4ef969f2-d76b-405e-baaf-c10a36d36ed3-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 00:21:13 crc kubenswrapper[4745]: I0319 00:21:13.607089 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4z5l\" (UniqueName: \"kubernetes.io/projected/4ef969f2-d76b-405e-baaf-c10a36d36ed3-kube-api-access-m4z5l\") on node \"crc\" DevicePath \"\"" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.188413 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" 
event={"ID":"4ef969f2-d76b-405e-baaf-c10a36d36ed3","Type":"ContainerDied","Data":"3ef7a137321f30be497e8cc61b9023655954715f64b2bf473455bf5c69e5a016"} Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.188479 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ef7a137321f30be497e8cc61b9023655954715f64b2bf473455bf5c69e5a016" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.188552 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.642296 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:21:14 crc kubenswrapper[4745]: E0319 00:21:14.642571 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="pull" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.642586 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="pull" Mar 19 00:21:14 crc kubenswrapper[4745]: E0319 00:21:14.642596 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="util" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.642603 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="util" Mar 19 00:21:14 crc kubenswrapper[4745]: E0319 00:21:14.642619 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="extract" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.642627 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="extract" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.642728 4745 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4ef969f2-d76b-405e-baaf-c10a36d36ed3" containerName="extract" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.643649 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.714403 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.722543 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hfvm\" (UniqueName: \"kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.722610 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.722649 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.824105 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hfvm\" (UniqueName: \"kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm\") pod \"redhat-operators-mtlwq\" 
(UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.824160 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.824183 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.824625 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:14 crc kubenswrapper[4745]: I0319 00:21:14.825021 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.186201 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hfvm\" (UniqueName: \"kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm\") pod \"redhat-operators-mtlwq\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " 
pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.260561 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.605591 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.605657 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.605706 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.606650 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:21:15 crc kubenswrapper[4745]: I0319 00:21:15.606723 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" 
containerID="cri-o://c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018" gracePeriod=600 Mar 19 00:21:15 crc kubenswrapper[4745]: E0319 00:21:15.746056 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400972f4_050f_4f26_b982_ced6f2590c8b.slice/crio-conmon-c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod400972f4_050f_4f26_b982_ced6f2590c8b.slice/crio-c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018.scope\": RecentStats: unable to find data in memory cache]" Mar 19 00:21:16 crc kubenswrapper[4745]: I0319 00:21:16.252123 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018" exitCode=0 Mar 19 00:21:16 crc kubenswrapper[4745]: I0319 00:21:16.252689 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018"} Mar 19 00:21:16 crc kubenswrapper[4745]: I0319 00:21:16.252750 4745 scope.go:117] "RemoveContainer" containerID="051b71cd1adf671f1f1535a2cdc76bfdb90671964e234784e73bcde9b37bd06d" Mar 19 00:21:16 crc kubenswrapper[4745]: I0319 00:21:16.778763 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:21:17 crc kubenswrapper[4745]: I0319 00:21:17.265397 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" 
event={"ID":"5ba244a1-0d6a-4cab-84c4-51501f3c7916","Type":"ContainerStarted","Data":"a2d2bb0f1656421fad969dd9938608d7c1f3a0c6742aaaefc4088efa505ad3db"} Mar 19 00:21:17 crc kubenswrapper[4745]: I0319 00:21:17.297138 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06"} Mar 19 00:21:17 crc kubenswrapper[4745]: I0319 00:21:17.310093 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerStarted","Data":"160fc9f43fde151a57b67e2c8925b999f7f111696e51e7dd87c757005709ba3f"} Mar 19 00:21:17 crc kubenswrapper[4745]: I0319 00:21:17.343196 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-64749dc85d-gzlt5" podStartSLOduration=15.756627829 podStartE2EDuration="22.343177996s" podCreationTimestamp="2026-03-19 00:20:55 +0000 UTC" firstStartedPulling="2026-03-19 00:21:09.664535923 +0000 UTC m=+834.202731054" lastFinishedPulling="2026-03-19 00:21:16.25108609 +0000 UTC m=+840.789281221" observedRunningTime="2026-03-19 00:21:17.310904773 +0000 UTC m=+841.849099904" watchObservedRunningTime="2026-03-19 00:21:17.343177996 +0000 UTC m=+841.881373127" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.350352 4745 generic.go:334] "Generic (PLEG): container finished" podID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerID="4d2e067269ab8d6ad4ff9fd9c0c25e46f9248271c2c159ef4b71cf8df753032d" exitCode=0 Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.350550 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" 
event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerDied","Data":"4d2e067269ab8d6ad4ff9fd9c0c25e46f9248271c2c159ef4b71cf8df753032d"} Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.357333 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" event={"ID":"bcf530c8-afe8-4a0e-9e5c-bfd85712e37a","Type":"ContainerStarted","Data":"5e7bbf77fd0253e4cc85b5b6317591552343f2a1875ac824d63bdc7e790b1814"} Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.422277 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-tz2rm" podStartSLOduration=3.457659316 podStartE2EDuration="31.422258392s" podCreationTimestamp="2026-03-19 00:20:47 +0000 UTC" firstStartedPulling="2026-03-19 00:20:48.603006074 +0000 UTC m=+813.141201205" lastFinishedPulling="2026-03-19 00:21:16.56760515 +0000 UTC m=+841.105800281" observedRunningTime="2026-03-19 00:21:18.419305491 +0000 UTC m=+842.957500642" watchObservedRunningTime="2026-03-19 00:21:18.422258392 +0000 UTC m=+842.960453523" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.583852 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.585045 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594142 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594616 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594666 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-p9t9n" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594159 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594253 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594339 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594420 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594494 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.594567 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.617837 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733029 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733091 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733126 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733146 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733166 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-http-certificates\") pod 
\"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733196 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733223 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733238 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733262 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733287 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733306 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733323 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733340 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/884040c3-6c56-45b0-881d-e73f52c0ab34-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733361 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: 
\"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.733387 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835280 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835383 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835414 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835438 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835472 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835502 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835521 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835538 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835566 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835588 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835610 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835641 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/884040c3-6c56-45b0-881d-e73f52c0ab34-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835669 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: 
\"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835695 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.835728 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.836016 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.836541 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.836535 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.844456 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.845320 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.847267 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.847454 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " 
pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.847695 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.847844 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.848294 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/884040c3-6c56-45b0-881d-e73f52c0ab34-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.848347 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.848798 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" 
(UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.855173 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.871616 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.883637 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/884040c3-6c56-45b0-881d-e73f52c0ab34-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"884040c3-6c56-45b0-881d-e73f52c0ab34\") " pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:18 crc kubenswrapper[4745]: I0319 00:21:18.907289 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Mar 19 00:21:19 crc kubenswrapper[4745]: I0319 00:21:19.790461 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-78548ff687-rjvkn" Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.938734 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g"] Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.940702 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.944358 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.945169 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-v22wp" Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.946397 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Mar 19 00:21:27 crc kubenswrapper[4745]: I0319 00:21:27.966492 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g"] Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.110781 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5ltd\" (UniqueName: \"kubernetes.io/projected/1340e1bb-a8aa-4a0c-b295-f49f94e81055-kube-api-access-q5ltd\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc 
kubenswrapper[4745]: I0319 00:21:28.110916 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1340e1bb-a8aa-4a0c-b295-f49f94e81055-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.531590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1340e1bb-a8aa-4a0c-b295-f49f94e81055-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.531705 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5ltd\" (UniqueName: \"kubernetes.io/projected/1340e1bb-a8aa-4a0c-b295-f49f94e81055-kube-api-access-q5ltd\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.533133 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1340e1bb-a8aa-4a0c-b295-f49f94e81055-tmp\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.594932 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5ltd\" (UniqueName: 
\"kubernetes.io/projected/1340e1bb-a8aa-4a0c-b295-f49f94e81055-kube-api-access-q5ltd\") pod \"cert-manager-operator-controller-manager-5586865c96-2hk9g\" (UID: \"1340e1bb-a8aa-4a0c-b295-f49f94e81055\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:28 crc kubenswrapper[4745]: I0319 00:21:28.870642 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" Mar 19 00:21:32 crc kubenswrapper[4745]: E0319 00:21:32.485986 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43" Mar 19 00:21:32 crc kubenswrapper[4745]: E0319 00:21:32.486754 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:interconnect-operator,Image:registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43,Command:[qdr-operator],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:60000,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:qdr-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_QDROUTERD_IMAGE,Value:registry.redhat.io/amq7/amq-interconnect@sha256:31d87473fa684178a694f9ee331d3c80f2653f9533cb65c2a325752166a077e9,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:amq7-interconnect-operator.v1.10.20,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wpkxz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
interconnect-operator-5bb49f789d-f6stw_service-telemetry(9f239a73-6d47-4e9b-a74e-f97757ec8e4f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 00:21:32 crc kubenswrapper[4745]: E0319 00:21:32.488228 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" podUID="9f239a73-6d47-4e9b-a74e-f97757ec8e4f" Mar 19 00:21:32 crc kubenswrapper[4745]: E0319 00:21:32.903004 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"interconnect-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/amq7/amq-interconnect-operator@sha256:a8b621237c872ded2a1d1d948fbebd693429e4a1ced1d7922406241a078d3d43\\\"\"" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" podUID="9f239a73-6d47-4e9b-a74e-f97757ec8e4f" Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.213684 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.222744 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g"] Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.926213 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"884040c3-6c56-45b0-881d-e73f52c0ab34","Type":"ContainerStarted","Data":"d7bb1db6b8b2ffad83723d95e989fc91204d27db1c2888598e44eba176edc90c"} Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.928427 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" event={"ID":"f5c4fe84-51cb-479a-a8cc-2e07bde21417","Type":"ContainerStarted","Data":"67b90f26a2cc7c0a5914348ece6b1670b8a3005382d0d4bc27217d72c3f1d8fc"} Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.931683 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" event={"ID":"69d14850-5c50-4c06-8581-2a70644c7de7","Type":"ContainerStarted","Data":"08fe4f272cd75074db9f6367a8948a99b41f1f15b9a54e731cc1ffcbb7774578"} Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.934453 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerStarted","Data":"24e30658109b4babb9c20dba7ffc475b9dffa17f6513457f28d5dd1a43e9bcfb"} Mar 19 00:21:33 crc kubenswrapper[4745]: I0319 00:21:33.936146 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" event={"ID":"1340e1bb-a8aa-4a0c-b295-f49f94e81055","Type":"ContainerStarted","Data":"759b9734b100c09547930afd66df15fff628cb84b25b4c0fbdd4b1ecd7514cd2"} Mar 19 00:21:34 crc kubenswrapper[4745]: I0319 00:21:34.044779 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2" podStartSLOduration=3.153676139 podStartE2EDuration="46.044747697s" podCreationTimestamp="2026-03-19 00:20:48 +0000 UTC" firstStartedPulling="2026-03-19 00:20:49.686324651 +0000 UTC m=+814.224519772" lastFinishedPulling="2026-03-19 00:21:32.577396199 +0000 UTC m=+857.115591330" observedRunningTime="2026-03-19 00:21:33.994863091 +0000 UTC m=+858.533058232" watchObservedRunningTime="2026-03-19 00:21:34.044747697 +0000 UTC m=+858.582942838" Mar 19 00:21:34 crc kubenswrapper[4745]: I0319 00:21:34.046417 4745 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78" podStartSLOduration=3.006949643 podStartE2EDuration="46.046406688s" podCreationTimestamp="2026-03-19 00:20:48 +0000 UTC" firstStartedPulling="2026-03-19 00:20:49.539945216 +0000 UTC m=+814.078140347" lastFinishedPulling="2026-03-19 00:21:32.579402261 +0000 UTC m=+857.117597392" observedRunningTime="2026-03-19 00:21:34.036332084 +0000 UTC m=+858.574527245" watchObservedRunningTime="2026-03-19 00:21:34.046406688 +0000 UTC m=+858.584601819" Mar 19 00:21:37 crc kubenswrapper[4745]: I0319 00:21:37.037356 4745 generic.go:334] "Generic (PLEG): container finished" podID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerID="24e30658109b4babb9c20dba7ffc475b9dffa17f6513457f28d5dd1a43e9bcfb" exitCode=0 Mar 19 00:21:37 crc kubenswrapper[4745]: I0319 00:21:37.038192 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerDied","Data":"24e30658109b4babb9c20dba7ffc475b9dffa17f6513457f28d5dd1a43e9bcfb"} Mar 19 00:21:38 crc kubenswrapper[4745]: I0319 00:21:38.083996 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerStarted","Data":"857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1"} Mar 19 00:21:38 crc kubenswrapper[4745]: I0319 00:21:38.113620 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mtlwq" podStartSLOduration=14.205200094 podStartE2EDuration="24.113588986s" podCreationTimestamp="2026-03-19 00:21:14 +0000 UTC" firstStartedPulling="2026-03-19 00:21:27.739141115 +0000 UTC m=+852.277336256" lastFinishedPulling="2026-03-19 00:21:37.647530017 +0000 UTC m=+862.185725148" observedRunningTime="2026-03-19 
00:21:38.110125409 +0000 UTC m=+862.648320550" watchObservedRunningTime="2026-03-19 00:21:38.113588986 +0000 UTC m=+862.651784117" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.708565 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.710199 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.716460 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.716627 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.716709 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.716767 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.773215 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.821011 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.821058 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwpcd\" (UniqueName: \"kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.821656 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.821734 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.821972 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822005 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs\") pod 
\"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822060 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822126 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822211 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822238 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822272 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.822335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.923930 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924031 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924084 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924113 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bwpcd\" (UniqueName: \"kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924136 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924159 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924184 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924210 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924238 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924270 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924311 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.924339 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.925343 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.925857 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.926072 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.926605 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.927449 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc 
kubenswrapper[4745]: I0319 00:21:44.928198 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.928297 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.928364 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.928819 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.934647 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.948021 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwpcd\" (UniqueName: \"kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:44 crc kubenswrapper[4745]: I0319 00:21:44.955941 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-1-build\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:45 crc kubenswrapper[4745]: I0319 00:21:45.030017 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:21:45 crc kubenswrapper[4745]: I0319 00:21:45.261286 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:45 crc kubenswrapper[4745]: I0319 00:21:45.261692 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:46 crc kubenswrapper[4745]: I0319 00:21:46.439206 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-mtlwq" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" probeResult="failure" output=< Mar 19 00:21:46 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s Mar 19 00:21:46 crc kubenswrapper[4745]: > Mar 19 00:21:49 crc kubenswrapper[4745]: I0319 00:21:49.299215 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:21:50 crc kubenswrapper[4745]: I0319 00:21:50.575287 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" event={"ID":"1340e1bb-a8aa-4a0c-b295-f49f94e81055","Type":"ContainerStarted","Data":"e327f92dbc5fff52d1912124f9ce2f0f29ee4ad9e54e3419624e6d02eb6e7ea1"} Mar 19 00:21:50 crc kubenswrapper[4745]: I0319 00:21:50.586490 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"6edd3146-9ede-4a63-b72b-5987ed600bce","Type":"ContainerStarted","Data":"694ce7af8b3166a88da3080fef22e0db15c17a9128111ade8560fc1f11e9a583"} Mar 19 00:21:50 crc kubenswrapper[4745]: I0319 00:21:50.593615 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" 
event={"ID":"9f239a73-6d47-4e9b-a74e-f97757ec8e4f","Type":"ContainerStarted","Data":"2755f249469ad287fc3eb9a78615601d6397f223089eb1ec2a6af5c32365dfda"} Mar 19 00:21:50 crc kubenswrapper[4745]: I0319 00:21:50.618371 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5586865c96-2hk9g" podStartSLOduration=7.82826577 podStartE2EDuration="23.618336028s" podCreationTimestamp="2026-03-19 00:21:27 +0000 UTC" firstStartedPulling="2026-03-19 00:21:33.23036525 +0000 UTC m=+857.768560381" lastFinishedPulling="2026-03-19 00:21:49.020435518 +0000 UTC m=+873.558630639" observedRunningTime="2026-03-19 00:21:50.608375956 +0000 UTC m=+875.146571097" watchObservedRunningTime="2026-03-19 00:21:50.618336028 +0000 UTC m=+875.156531169" Mar 19 00:21:55 crc kubenswrapper[4745]: I0319 00:21:55.228533 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-f6stw" podStartSLOduration=22.903006186 podStartE2EDuration="1m2.228509265s" podCreationTimestamp="2026-03-19 00:20:53 +0000 UTC" firstStartedPulling="2026-03-19 00:21:09.694918149 +0000 UTC m=+834.233113280" lastFinishedPulling="2026-03-19 00:21:49.020421228 +0000 UTC m=+873.558616359" observedRunningTime="2026-03-19 00:21:50.637618259 +0000 UTC m=+875.175813390" watchObservedRunningTime="2026-03-19 00:21:55.228509265 +0000 UTC m=+879.766704396" Mar 19 00:21:55 crc kubenswrapper[4745]: I0319 00:21:55.239237 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:21:55 crc kubenswrapper[4745]: I0319 00:21:55.785753 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.109086 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.207550 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.619801 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-vr5md"] Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.621027 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.625336 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.625634 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-2x97z" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.626088 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.644494 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-vr5md"] Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.761153 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtdsz\" (UniqueName: \"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-kube-api-access-mtdsz\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.761554 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.863737 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtdsz\" (UniqueName: \"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-kube-api-access-mtdsz\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.863797 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.905948 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.911972 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtdsz\" (UniqueName: \"kubernetes.io/projected/9f7ceaac-a9f7-467b-83c9-298813ff6323-kube-api-access-mtdsz\") pod \"cert-manager-webhook-6888856db4-vr5md\" (UID: \"9f7ceaac-a9f7-467b-83c9-298813ff6323\") " pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:56 crc kubenswrapper[4745]: I0319 00:21:56.962242 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.244386 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-cpzpl"] Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.245262 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.247327 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-nv497" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.265767 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-cpzpl"] Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.375212 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.375360 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvxxn\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-kube-api-access-xvxxn\") pod \"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.477747 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvxxn\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-kube-api-access-xvxxn\") pod 
\"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.477838 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.497293 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.509685 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvxxn\" (UniqueName: \"kubernetes.io/projected/bbe8b718-863a-404e-9be9-e872318f1ac0-kube-api-access-xvxxn\") pod \"cert-manager-cainjector-5545bd876-cpzpl\" (UID: \"bbe8b718-863a-404e-9be9-e872318f1ac0\") " pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.566333 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.582233 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mtlwq" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" containerID="cri-o://857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" gracePeriod=2 Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.653432 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.655338 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.660173 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.660407 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.660596 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.689080 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.782909 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783056 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783093 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783124 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq4w5\" (UniqueName: \"kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783287 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783412 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" 
(UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783576 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783654 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783699 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783736 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783849 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.783926 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.885996 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886083 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886155 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886194 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886240 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq4w5\" (UniqueName: \"kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886299 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886352 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886410 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886457 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886492 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886531 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886800 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.886926 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.887214 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.887297 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.887637 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: 
\"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.887719 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.888140 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.888458 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.892466 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.893275 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: 
\"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.928481 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq4w5\" (UniqueName: \"kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:57 crc kubenswrapper[4745]: I0319 00:21:57.933499 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:58 crc kubenswrapper[4745]: I0319 00:21:58.047501 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:21:58 crc kubenswrapper[4745]: I0319 00:21:58.737231 4745 generic.go:334] "Generic (PLEG): container finished" podID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" exitCode=0 Mar 19 00:21:58 crc kubenswrapper[4745]: I0319 00:21:58.737729 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerDied","Data":"857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1"} Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.155657 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564662-znhfd"] Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.156587 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564662-znhfd"] Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.156702 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.160787 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.161111 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.161801 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.322994 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwsgv\" (UniqueName: \"kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv\") pod \"auto-csr-approver-29564662-znhfd\" (UID: \"deed3a0c-ada3-41b5-895b-8acc45926539\") " pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.425336 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwsgv\" (UniqueName: \"kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv\") pod \"auto-csr-approver-29564662-znhfd\" (UID: \"deed3a0c-ada3-41b5-895b-8acc45926539\") " pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.452193 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwsgv\" (UniqueName: \"kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv\") pod \"auto-csr-approver-29564662-znhfd\" (UID: \"deed3a0c-ada3-41b5-895b-8acc45926539\") " pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:00 crc kubenswrapper[4745]: I0319 00:22:00.479459 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564662-znhfd" Mar 19 00:22:05 crc kubenswrapper[4745]: E0319 00:22:05.262093 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:05 crc kubenswrapper[4745]: E0319 00:22:05.263931 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:05 crc kubenswrapper[4745]: E0319 00:22:05.264306 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:05 crc kubenswrapper[4745]: E0319 00:22:05.264413 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-mtlwq" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.761979 4745 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["cert-manager/cert-manager-545d4d4674-ptrd5"] Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.763068 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.766203 4745 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-r8wwt" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.793313 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-ptrd5"] Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.798389 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-bound-sa-token\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.798489 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zqdw\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-kube-api-access-8zqdw\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.899102 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zqdw\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-kube-api-access-8zqdw\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.899213 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-bound-sa-token\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:06 crc kubenswrapper[4745]: I0319 00:22:06.922260 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-bound-sa-token\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:07 crc kubenswrapper[4745]: I0319 00:22:07.659650 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zqdw\" (UniqueName: \"kubernetes.io/projected/93f48ad8-0863-4d90-abac-b887096b386c-kube-api-access-8zqdw\") pod \"cert-manager-545d4d4674-ptrd5\" (UID: \"93f48ad8-0863-4d90-abac-b887096b386c\") " pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:07 crc kubenswrapper[4745]: I0319 00:22:07.688015 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-ptrd5" Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.263498 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.266707 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.267335 4745 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.267385 4745 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-mtlwq" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server" Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.980752 4745 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a908a23111a624c3fa04dc3105a7a97f48ee60105308bbb6ed42a40d63c2fe" Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.981051 4745 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 00:22:15 crc kubenswrapper[4745]: init container &Container{Name:manage-dockerfile,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d7a908a23111a624c3fa04dc3105a7a97f48ee60105308bbb6ed42a40d63c2fe,Command:[],Args:[openshift-manage-dockerfile --v=0],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:BUILD,Value:{"kind":"Build","apiVersion":"build.openshift.io/v1","metadata":{"name":"service-telemetry-operator-1","namespace":"service-telemetry","uid":"6c226391-0d08-43e7-b93d-149a01173291","resourceVersion":"34861","generation":1,"creationTimestamp":"2026-03-19T00:21:44Z","labels":{"build":"service-telemetry-operator","buildconfig":"service-telemetry-operator","openshift.io/build-config.name":"service-telemetry-operator","openshift.io/build.start-policy":"Serial"},"annotations":{"openshift.io/build-config.name":"service-telemetry-operator","openshift.io/build.number":"1"},"ownerReferences":[{"apiVersion":"build.openshift.io/v1","kind":"BuildConfig","name":"service-telemetry-operator","uid":"aaf63d90-27c9-4514-a696-f2b813b6c2e3","controller":true}],"managedFields":[{"manager":"openshift-apiserver","operation":"Update","apiVersion":"build.openshift.io/v1","time":"2026-03-19T00:21:44Z","fieldsType":"FieldsV1","fieldsV1":{"f:metadata":{"f:annotations":{".":{},"f:openshift.io/build-config.name":{},"f:openshift.io/build.number":{}},"f:labels":{".":{},"f:build":{},"f:buildconfig":{},"f:openshift.io/build-config.name":{},"f:openshift.io/build.start-policy":{}},"f:ownerReferences":{".":{},"k:{\"uid\":\"aaf63d90-27c9-4514-a696-f2b813b6c2e3\"}":{}}},"f:spec":{"f:output":{"f:to":{}},"f:serviceAccount":{},"f:source":{"f:dockerfile":{},"f:t
ype":{}},"f:strategy":{"f:dockerStrategy":{".":{},"f:from":{}},"f:type":{}},"f:triggeredBy":{}},"f:status":{"f:conditions":{".":{},"k:{\"type\":\"New\"}":{".":{},"f:lastTransitionTime":{},"f:lastUpdateTime":{},"f:status":{},"f:type":{}}},"f:config":{},"f:phase":{}}}}]},"spec":{"serviceAccount":"builder","source":{"type":"Dockerfile","dockerfile":"FROM quay.io/operator-framework/ansible-operator:v1.38.1\n\n# temporarily switch to root user to adjust image layers\nUSER 0\n# Upstream CI builds need the additional EPEL sources for python3-passlib and python3-bcrypt but have no working repos to install epel-release\n# NO_PROXY is undefined in upstream CI builds, but defined (usually blank) during openshift builds (a possibly brittle hack)\nRUN bash -c -- 'if [ \"${NO_PROXY:-__ZZZZZ}\" == \"__ZZZZZ\" ]; then echo \"Applying upstream EPEL hacks\" \u0026\u0026 echo -e \"-----BEGIN PGP PUBLIC KEY BLOCK-----\\nmQINBGE3mOsBEACsU+XwJWDJVkItBaugXhXIIkb9oe+7aadELuVo0kBmc3HXt/Yp\\nCJW9hHEiGZ6z2jwgPqyJjZhCvcAWvgzKcvqE+9i0NItV1rzfxrBe2BtUtZmVcuE6\\n2b+SPfxQ2Hr8llaawRjt8BCFX/ZzM4/1Qk+EzlfTcEcpkMf6wdO7kD6ulBk/tbsW\\nDHX2lNcxszTf+XP9HXHWJlA2xBfP+Dk4gl4DnO2Y1xR0OSywE/QtvEbN5cY94ieu\\nn7CBy29AleMhmbnx9pw3NyxcFIAsEZHJoU4ZW9ulAJ/ogttSyAWeacW7eJGW31/Z\\n39cS+I4KXJgeGRI20RmpqfH0tuT+X5Da59YpjYxkbhSK3HYBVnNPhoJFUc2j5iKy\\nXLgkapu1xRnEJhw05kr4LCbud0NTvfecqSqa+59kuVc+zWmfTnGTYc0PXZ6Oa3rK\\n44UOmE6eAT5zd/ToleDO0VesN+EO7CXfRsm7HWGpABF5wNK3vIEF2uRr2VJMvgqS\\n9eNwhJyOzoca4xFSwCkc6dACGGkV+CqhufdFBhmcAsUotSxe3zmrBjqA0B/nxIvH\\nDVgOAMnVCe+Lmv8T0mFgqZSJdIUdKjnOLu/GRFhjDKIak4jeMBMTYpVnU+HhMHLq\\nuDiZkNEvEEGhBQmZuI8J55F/a6UURnxUwT3piyi3Pmr2IFD7ahBxPzOBCQARAQAB\\ntCdGZWRvcmEgKGVwZWw5KSA8ZXBlbEBmZWRvcmFwcm9qZWN0Lm9yZz6JAk4EEwEI\\nADgWIQT/itE0RZcQbs6BO5GKOHK/MihGfAUCYTeY6wIbDwULCQgHAgYVCgkICwIE\\nFgIDAQIeAQIXgAAKCRCKOHK/MihGfFX/EACBPWv20+ttYu1A5WvtHJPzwbj0U4yF\\n3zTQpBglQ2UfkRpYdipTlT3Ih6j5h2VmgRPtINCc/ZE28adrWpBoeFIS2YAKOCLC\\nnZYtHl2nCoLq1U7FSttUGsZ/t8uGCBgnugTfnIYcmlP1jKKA6RJAclK89evDQX5n\\nR9ZD+Cq3CBMltt
vSTCht0qQVlwycedH8iWyYgP/mF0W35BIn7NuuZwWhgR00n/VG\\n4nbKPOzTWbsP45awcmivdrS74P6mL84WfkghipdmcoyVb1B8ZP4Y/Ke0RXOnLhNe\\nCfrXXvuW+Pvg2RTfwRDtehGQPAgXbmLmz2ZkV69RGIr54HJv84NDbqZovRTMr7gL\\n9k3ciCzXCiYQgM8yAyGHV0KEhFSQ1HV7gMnt9UmxbxBE2pGU7vu3CwjYga5DpwU7\\nw5wu1TmM5KgZtZvuWOTDnqDLf0cKoIbW8FeeCOn24elcj32bnQDuF9DPey1mqcvT\\n/yEo/Ushyz6CVYxN8DGgcy2M9JOsnmjDx02h6qgWGWDuKgb9jZrvRedpAQCeemEd\\nfhEs6ihqVxRFl16HxC4EVijybhAL76SsM2nbtIqW1apBQJQpXWtQwwdvgTVpdEtE\\nr4ArVJYX5LrswnWEQMOelugUG6S3ZjMfcyOa/O0364iY73vyVgaYK+2XtT2usMux\\nVL469Kj5m13T6w==\\n=Mjs/\\n-----END PGP PUBLIC KEY BLOCK-----\" \u003e /etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-9 \u0026\u0026 echo -e \"[epel]\\nname=Extra Packages for Enterprise Linux 9 - \\$basearch\\nmetalink=https://mirrors.fedoraproject.org/metalink?repo=epel-9\u0026arch=\\$basearch\u0026infra=\\$infra\u0026content=\\$contentdir\\nenabled=1\\ngpgcheck=1\\ngpgkey=file:///etc/pki/rpm-gpg/RPM-GPG-KEY-EPEL-9\" \u003e /etc/yum.repos.d/epel.repo; fi'\n\n# update the base image to allow forward-looking optimistic updates during the testing phase, with the added benefit of helping move closer to passing security scans.\n# -- excludes ansible so it remains at 2.9 tag as shipped with the base image\n# -- installs python3-passlib and python3-bcrypt for oauth-proxy interface\n# -- cleans up the cached data from dnf to keep the image as small as possible\nRUN dnf update -y --exclude=ansible* \u0026\u0026 dnf install -y python3-passlib python3-bcrypt \u0026\u0026 dnf clean all \u0026\u0026 rm -rf /var/cache/dnf\n\nCOPY requirements.yml ${HOME}/requirements.yml\nRUN ansible-galaxy collection install -r ${HOME}/requirements.yml \\\n \u0026\u0026 chmod -R ug+rwx ${HOME}/.ansible\n\n# switch back to user 1001 when running the base image (non-root)\nUSER 1001\n\n# copy in required artifacts for the operator\nCOPY watches.yaml ${HOME}/watches.yaml\nCOPY roles/ 
${HOME}/roles/\n"},"strategy":{"type":"Docker","dockerStrategy":{"from":{"kind":"DockerImage","name":"quay.io/operator-framework/ansible-operator@sha256:9895727b7f66bb88fa4c6afdefc7eecf86e6b7c1293920f866a035da9decc58e"},"pullSecret":{"name":"builder-dockercfg-vcnqb"}}},"output":{"to":{"kind":"DockerImage","name":"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-operator:latest"},"pushSecret":{"name":"builder-dockercfg-vcnqb"}},"resources":{},"postCommit":{},"nodeSelector":null,"triggeredBy":[{"message":"Image change","imageChangeBuild":{"imageID":"quay.io/operator-framework/ansible-operator@sha256:9895727b7f66bb88fa4c6afdefc7eecf86e6b7c1293920f866a035da9decc58e","fromRef":{"kind":"ImageStreamTag","name":"ansible-operator:v1.38.1"}}}]},"status":{"phase":"New","outputDockerImageReference":"image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-operator:latest","config":{"kind":"BuildConfig","namespace":"service-telemetry","name":"service-telemetry-operator"},"output":{},"conditions":[{"type":"New","status":"True","lastUpdateTime":"2026-03-19T00:21:44Z","lastTransitionTime":"2026-03-19T00:21:44Z"}]}} Mar 19 00:22:15 crc kubenswrapper[4745]: 
,ValueFrom:nil,},EnvVar{Name:LANG,Value:C.utf8,ValueFrom:nil,},EnvVar{Name:BUILD_REGISTRIES_CONF_PATH,Value:/var/run/configs/openshift.io/build-system/registries.conf,ValueFrom:nil,},EnvVar{Name:BUILD_REGISTRIES_DIR_PATH,Value:/var/run/configs/openshift.io/build-system/registries.d,ValueFrom:nil,},EnvVar{Name:BUILD_SIGNATURE_POLICY_PATH,Value:/var/run/configs/openshift.io/build-system/policy.json,ValueFrom:nil,},EnvVar{Name:BUILD_STORAGE_CONF_PATH,Value:/var/run/configs/openshift.io/build-system/storage.conf,ValueFrom:nil,},EnvVar{Name:BUILD_BLOBCACHE_DIR,Value:/var/cache/blobs,ValueFrom:nil,},EnvVar{Name:HTTP_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:http_proxy,Value:,ValueFrom:nil,},EnvVar{Name:HTTPS_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:https_proxy,Value:,ValueFrom:nil,},EnvVar{Name:NO_PROXY,Value:,ValueFrom:nil,},EnvVar{Name:no_proxy,Value:,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:buildworkdir,ReadOnly:false,MountPath:/tmp/build,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-system-configs,ReadOnly:true,MountPath:/var/run/configs/openshift.io/build-system,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-ca-bundles,ReadOnly:false,MountPath:/var/run/configs/openshift.io/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-proxy-ca-bundles,ReadOnly:false,MountPath:/var/run/configs/openshift.io/pki,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:build-blob-cache,ReadOnly:false,MountPath:/var/cache/blobs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bwpcd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,L
ifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[CHOWN DAC_OVERRIDE],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-telemetry-operator-1-build_service-telemetry(6edd3146-9ede-4a63-b72b-5987ed600bce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 19 00:22:15 crc kubenswrapper[4745]: > logger="UnhandledError" Mar 19 00:22:15 crc kubenswrapper[4745]: E0319 00:22:15.982415 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manage-dockerfile\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/service-telemetry-operator-1-build" podUID="6edd3146-9ede-4a63-b72b-5987ed600bce" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.001983 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.184741 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hfvm\" (UniqueName: \"kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm\") pod \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.185268 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities\") pod \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.185377 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content\") pod \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\" (UID: \"2886a73a-35b3-4014-ab67-1b88fa88b4d8\") " Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.200409 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities" (OuterVolumeSpecName: "utilities") pod "2886a73a-35b3-4014-ab67-1b88fa88b4d8" (UID: "2886a73a-35b3-4014-ab67-1b88fa88b4d8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.293487 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm" (OuterVolumeSpecName: "kube-api-access-8hfvm") pod "2886a73a-35b3-4014-ab67-1b88fa88b4d8" (UID: "2886a73a-35b3-4014-ab67-1b88fa88b4d8"). InnerVolumeSpecName "kube-api-access-8hfvm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.294443 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hfvm\" (UniqueName: \"kubernetes.io/projected/2886a73a-35b3-4014-ab67-1b88fa88b4d8-kube-api-access-8hfvm\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.294466 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.323801 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2886a73a-35b3-4014-ab67-1b88fa88b4d8" (UID: "2886a73a-35b3-4014-ab67-1b88fa88b4d8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.398003 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2886a73a-35b3-4014-ab67-1b88fa88b4d8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.829892 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mtlwq" Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.829841 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mtlwq" event={"ID":"2886a73a-35b3-4014-ab67-1b88fa88b4d8","Type":"ContainerDied","Data":"160fc9f43fde151a57b67e2c8925b999f7f111696e51e7dd87c757005709ba3f"} Mar 19 00:22:16 crc kubenswrapper[4745]: I0319 00:22:16.830994 4745 scope.go:117] "RemoveContainer" containerID="857f6431c240a666d31a21fcfc4d6d6a9093efd57092d2eeb9cc92118fb64ee1" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.052106 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.060519 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mtlwq"] Mar 19 00:22:17 crc kubenswrapper[4745]: E0319 00:22:17.185109 4745 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2886a73a_35b3_4014_ab67_1b88fa88b4d8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2886a73a_35b3_4014_ab67_1b88fa88b4d8.slice/crio-160fc9f43fde151a57b67e2c8925b999f7f111696e51e7dd87c757005709ba3f\": RecentStats: unable to find data in memory cache]" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.214648 4745 scope.go:117] "RemoveContainer" containerID="24e30658109b4babb9c20dba7ffc475b9dffa17f6513457f28d5dd1a43e9bcfb" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.254317 4745 scope.go:117] "RemoveContainer" containerID="4d2e067269ab8d6ad4ff9fd9c0c25e46f9248271c2c159ef4b71cf8df753032d" Mar 19 00:22:17 crc kubenswrapper[4745]: E0319 00:22:17.266656 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="registry.connect.redhat.com/elastic/elasticsearch:7.17.20" Mar 19 00:22:17 crc kubenswrapper[4745]: E0319 00:22:17.266924 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:elastic-internal-init-filesystem,Image:registry.connect.redhat.com/elastic/elasticsearch:7.17.20,Command:[bash -c /mnt/elastic-internal/scripts/prepare-fs.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:HEADLESS_SERVICE_NAME,Value:elasticsearch-es-default,ValueFrom:nil,},EnvVar{Name:PROBE_PASSWORD_PATH,Value:/mnt/elastic-internal/pod-mounted-users/elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:PROBE_USERNAME,Value:elastic-internal-probe,ValueFrom:nil,},EnvVar{Name:READINESS_PROBE_PROTOCOL,Value:https,ValueFrom:nil,},EnvVar{Name:NSS_SDB_USE_CACHE,Value:no,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Requests:ResourceList{cpu: {{100 -3} {} 100m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:downward-api,ReadOnly:true,MountPath:/mnt/elastic-internal/downward-api,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-bin-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-bin-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config,ReadOnly:true,MountPath:/mnt/elastic-internal/elasticsearch-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-config-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-config-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-elasticsearch-plugins-local,ReadOnly:false,MountPath:/mnt/elastic-internal/elasticsearch-plugins-local,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-http-certificates,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/http-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-probe-user,ReadOnly:true,MountPath:/mnt/elastic-internal/pod-mounted-users,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-remote-certificate-authorities,ReadOnly:true,MountPath:/usr/share/elasticsearch/config/transport-remote-certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-scripts,ReadOnly:true,MountPath:/mnt/elastic-internal/scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-transport-certificates,ReadOnly:true,MountPath:/mnt/elastic-internal/transport-certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-unicast-hosts,ReadOnly:
true,MountPath:/mnt/elastic-internal/unicast-hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elastic-internal-xpack-file-realm,ReadOnly:true,MountPath:/mnt/elastic-internal/xpack-file-realm,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-data,ReadOnly:false,MountPath:/usr/share/elasticsearch/data,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:elasticsearch-logs,ReadOnly:false,MountPath:/usr/share/elasticsearch/logs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tmp-volume,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod elasticsearch-es-default-0_service-telemetry(884040c3-6c56-45b0-881d-e73f52c0ab34): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 00:22:17 crc kubenswrapper[4745]: E0319 00:22:17.268300 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" Mar 19 00:22:17 crc 
kubenswrapper[4745]: I0319 00:22:17.403339 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-cpzpl"] Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.449941 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.613981 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564662-znhfd"] Mar 19 00:22:17 crc kubenswrapper[4745]: W0319 00:22:17.619007 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddeed3a0c_ada3_41b5_895b_8acc45926539.slice/crio-384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5 WatchSource:0}: Error finding container 384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5: Status 404 returned error can't find the container with id 384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5 Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621453 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621500 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621553 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621592 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621637 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621657 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621691 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bwpcd\" (UniqueName: \"kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621729 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: 
\"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621757 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621739 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621774 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621805 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.621822 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache\") pod \"6edd3146-9ede-4a63-b72b-5987ed600bce\" (UID: \"6edd3146-9ede-4a63-b72b-5987ed600bce\") " Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 
00:22:17.622054 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622075 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622179 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622192 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622205 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622228 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: 
"6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622267 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622340 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.622989 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.623673 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.625297 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.629043 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.629093 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.629119 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd" (OuterVolumeSpecName: "kube-api-access-bwpcd") pod "6edd3146-9ede-4a63-b72b-5987ed600bce" (UID: "6edd3146-9ede-4a63-b72b-5987ed600bce"). InnerVolumeSpecName "kube-api-access-bwpcd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.688193 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-ptrd5"] Mar 19 00:22:17 crc kubenswrapper[4745]: W0319 00:22:17.690777 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod93f48ad8_0863_4d90_abac_b887096b386c.slice/crio-b254ab1be0019ac2cfdcd3616e1d8a6f42bb2e1b3206f07749a8cbfc84261970 WatchSource:0}: Error finding container b254ab1be0019ac2cfdcd3616e1d8a6f42bb2e1b3206f07749a8cbfc84261970: Status 404 returned error can't find the container with id b254ab1be0019ac2cfdcd3616e1d8a6f42bb2e1b3206f07749a8cbfc84261970 Mar 19 00:22:17 crc kubenswrapper[4745]: W0319 00:22:17.691581 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9f7ceaac_a9f7_467b_83c9_298813ff6323.slice/crio-7d520b94b5e21d79e89e7ffd2add00bde982738351afc3753fbe52b023944e2a WatchSource:0}: Error finding container 7d520b94b5e21d79e89e7ffd2add00bde982738351afc3753fbe52b023944e2a: Status 404 returned error can't find the container with id 7d520b94b5e21d79e89e7ffd2add00bde982738351afc3753fbe52b023944e2a Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.698603 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-vr5md"] Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.721382 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724157 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 
00:22:17.724187 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724196 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bwpcd\" (UniqueName: \"kubernetes.io/projected/6edd3146-9ede-4a63-b72b-5987ed600bce-kube-api-access-bwpcd\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724206 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724216 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6edd3146-9ede-4a63-b72b-5987ed600bce-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724224 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6edd3146-9ede-4a63-b72b-5987ed600bce-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724261 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724270 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6edd3146-9ede-4a63-b72b-5987ed600bce-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.724281 4745 
reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6edd3146-9ede-4a63-b72b-5987ed600bce-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:22:17 crc kubenswrapper[4745]: W0319 00:22:17.725608 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9628a478_fb27_4c42_bcf5_2a329898708b.slice/crio-58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978 WatchSource:0}: Error finding container 58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978: Status 404 returned error can't find the container with id 58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978 Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.836819 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerStarted","Data":"58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978"} Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.839195 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" event={"ID":"9f7ceaac-a9f7-467b-83c9-298813ff6323","Type":"ContainerStarted","Data":"7d520b94b5e21d79e89e7ffd2add00bde982738351afc3753fbe52b023944e2a"} Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.843048 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" event={"ID":"bbe8b718-863a-404e-9be9-e872318f1ac0","Type":"ContainerStarted","Data":"a44f785d85411a8d3d7d159203cf5f427adb861697c350dd1de19f7a9fde89f7"} Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.845743 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.845742 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"6edd3146-9ede-4a63-b72b-5987ed600bce","Type":"ContainerDied","Data":"694ce7af8b3166a88da3080fef22e0db15c17a9128111ade8560fc1f11e9a583"} Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.847549 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564662-znhfd" event={"ID":"deed3a0c-ada3-41b5-895b-8acc45926539","Type":"ContainerStarted","Data":"384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5"} Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.848838 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-ptrd5" event={"ID":"93f48ad8-0863-4d90-abac-b887096b386c","Type":"ContainerStarted","Data":"b254ab1be0019ac2cfdcd3616e1d8a6f42bb2e1b3206f07749a8cbfc84261970"} Mar 19 00:22:17 crc kubenswrapper[4745]: E0319 00:22:17.850498 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.931264 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:22:17 crc kubenswrapper[4745]: I0319 00:22:17.937643 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Mar 19 00:22:18 crc kubenswrapper[4745]: I0319 00:22:18.071732 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 19 
00:22:18 crc kubenswrapper[4745]: I0319 00:22:18.103110 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Mar 19 00:22:18 crc kubenswrapper[4745]: I0319 00:22:18.147937 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" path="/var/lib/kubelet/pods/2886a73a-35b3-4014-ab67-1b88fa88b4d8/volumes" Mar 19 00:22:18 crc kubenswrapper[4745]: I0319 00:22:18.149045 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6edd3146-9ede-4a63-b72b-5987ed600bce" path="/var/lib/kubelet/pods/6edd3146-9ede-4a63-b72b-5987ed600bce/volumes" Mar 19 00:22:18 crc kubenswrapper[4745]: I0319 00:22:18.861646 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerStarted","Data":"dd061ee450110ddfede6713e7818b0934e91c5d3775e82e313f1bc2da43e1a88"} Mar 19 00:22:18 crc kubenswrapper[4745]: E0319 00:22:18.864348 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" Mar 19 00:22:19 crc kubenswrapper[4745]: I0319 00:22:19.871807 4745 generic.go:334] "Generic (PLEG): container finished" podID="deed3a0c-ada3-41b5-895b-8acc45926539" containerID="7d29ab0663977a94ba5c0f15b3cbd0ce7ec172f2fc28bc0ca2d89b44013b1e84" exitCode=0 Mar 19 00:22:19 crc kubenswrapper[4745]: I0319 00:22:19.871842 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564662-znhfd" event={"ID":"deed3a0c-ada3-41b5-895b-8acc45926539","Type":"ContainerDied","Data":"7d29ab0663977a94ba5c0f15b3cbd0ce7ec172f2fc28bc0ca2d89b44013b1e84"} Mar 19 00:22:19 crc 
kubenswrapper[4745]: E0319 00:22:19.874852 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"elastic-internal-init-filesystem\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.connect.redhat.com/elastic/elasticsearch:7.17.20\\\"\"" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34"
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.249216 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564662-znhfd"
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.306284 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwsgv\" (UniqueName: \"kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv\") pod \"deed3a0c-ada3-41b5-895b-8acc45926539\" (UID: \"deed3a0c-ada3-41b5-895b-8acc45926539\") "
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.313392 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv" (OuterVolumeSpecName: "kube-api-access-gwsgv") pod "deed3a0c-ada3-41b5-895b-8acc45926539" (UID: "deed3a0c-ada3-41b5-895b-8acc45926539"). InnerVolumeSpecName "kube-api-access-gwsgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.407928 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwsgv\" (UniqueName: \"kubernetes.io/projected/deed3a0c-ada3-41b5-895b-8acc45926539-kube-api-access-gwsgv\") on node \"crc\" DevicePath \"\""
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.902504 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" event={"ID":"bbe8b718-863a-404e-9be9-e872318f1ac0","Type":"ContainerStarted","Data":"4a60e5014334d2bfaa36f00bc65a9db1478d4b51d8a1b3468b45b453adce650a"}
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.904279 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564662-znhfd"
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.904267 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564662-znhfd" event={"ID":"deed3a0c-ada3-41b5-895b-8acc45926539","Type":"ContainerDied","Data":"384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5"}
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.904395 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="384224cc8f809ba00927bca54562c15dd639f37fb13daad965dbd985c43071e5"
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.906007 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-ptrd5" event={"ID":"93f48ad8-0863-4d90-abac-b887096b386c","Type":"ContainerStarted","Data":"0ce1cf8b1ab7f41eba4be95314b4d5e70806db344dc2f5207a232338609b2723"}
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.907580 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" event={"ID":"9f7ceaac-a9f7-467b-83c9-298813ff6323","Type":"ContainerStarted","Data":"ed08192019c25324529343b9fe7d94e2e2866ff48cd56fb32836733501c7f185"}
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.907690 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md"
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.925149 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-cpzpl" podStartSLOduration=21.035819479 podStartE2EDuration="25.925122534s" podCreationTimestamp="2026-03-19 00:21:57 +0000 UTC" firstStartedPulling="2026-03-19 00:22:17.425036703 +0000 UTC m=+901.963231834" lastFinishedPulling="2026-03-19 00:22:22.314339758 +0000 UTC m=+906.852534889" observedRunningTime="2026-03-19 00:22:22.921533251 +0000 UTC m=+907.459728392" watchObservedRunningTime="2026-03-19 00:22:22.925122534 +0000 UTC m=+907.463317665"
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.952460 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-ptrd5" podStartSLOduration=12.302124814999999 podStartE2EDuration="16.952430695s" podCreationTimestamp="2026-03-19 00:22:06 +0000 UTC" firstStartedPulling="2026-03-19 00:22:17.69325549 +0000 UTC m=+902.231450621" lastFinishedPulling="2026-03-19 00:22:22.34356137 +0000 UTC m=+906.881756501" observedRunningTime="2026-03-19 00:22:22.95224128 +0000 UTC m=+907.490436411" watchObservedRunningTime="2026-03-19 00:22:22.952430695 +0000 UTC m=+907.490625826"
Mar 19 00:22:22 crc kubenswrapper[4745]: I0319 00:22:22.985698 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md" podStartSLOduration=22.358471483 podStartE2EDuration="26.985671442s" podCreationTimestamp="2026-03-19 00:21:56 +0000 UTC" firstStartedPulling="2026-03-19 00:22:17.693538759 +0000 UTC m=+902.231733890" lastFinishedPulling="2026-03-19 00:22:22.320738718 +0000 UTC m=+906.858933849" observedRunningTime="2026-03-19 00:22:22.981292206 +0000 UTC m=+907.519487337" watchObservedRunningTime="2026-03-19 00:22:22.985671442 +0000 UTC m=+907.523866573"
Mar 19 00:22:23 crc kubenswrapper[4745]: I0319 00:22:23.316720 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564656-7jksw"]
Mar 19 00:22:23 crc kubenswrapper[4745]: I0319 00:22:23.347817 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564656-7jksw"]
Mar 19 00:22:24 crc kubenswrapper[4745]: I0319 00:22:24.146783 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1" path="/var/lib/kubelet/pods/f2521a0e-7dcb-48a5-8bd3-9f60f5458dc1/volumes"
Mar 19 00:22:28 crc kubenswrapper[4745]: I0319 00:22:28.953684 4745 generic.go:334] "Generic (PLEG): container finished" podID="9628a478-fb27-4c42-bcf5-2a329898708b" containerID="dd061ee450110ddfede6713e7818b0934e91c5d3775e82e313f1bc2da43e1a88" exitCode=0
Mar 19 00:22:28 crc kubenswrapper[4745]: I0319 00:22:28.953817 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerDied","Data":"dd061ee450110ddfede6713e7818b0934e91c5d3775e82e313f1bc2da43e1a88"}
Mar 19 00:22:29 crc kubenswrapper[4745]: I0319 00:22:29.962293 4745 generic.go:334] "Generic (PLEG): container finished" podID="9628a478-fb27-4c42-bcf5-2a329898708b" containerID="c30d083389fd186a7069c7aaf9d05af719cf1d4cce9883d54a6420598b98e1e5" exitCode=0
Mar 19 00:22:29 crc kubenswrapper[4745]: I0319 00:22:29.962390 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerDied","Data":"c30d083389fd186a7069c7aaf9d05af719cf1d4cce9883d54a6420598b98e1e5"}
Mar 19 00:22:30 crc kubenswrapper[4745]: I0319 00:22:30.023488 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_9628a478-fb27-4c42-bcf5-2a329898708b/manage-dockerfile/0.log"
Mar 19 00:22:30 crc kubenswrapper[4745]: I0319 00:22:30.972462 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerStarted","Data":"b4703b83174ab7b2103b6b149b2ad8cb67a489bfe97d982169000bed076edd3c"}
Mar 19 00:22:31 crc kubenswrapper[4745]: I0319 00:22:31.006727 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-2-build" podStartSLOduration=33.482700093 podStartE2EDuration="34.006704941s" podCreationTimestamp="2026-03-19 00:21:57 +0000 UTC" firstStartedPulling="2026-03-19 00:22:17.729008046 +0000 UTC m=+902.267203177" lastFinishedPulling="2026-03-19 00:22:18.253012894 +0000 UTC m=+902.791208025" observedRunningTime="2026-03-19 00:22:31.004486352 +0000 UTC m=+915.542681483" watchObservedRunningTime="2026-03-19 00:22:31.006704941 +0000 UTC m=+915.544900072"
Mar 19 00:22:31 crc kubenswrapper[4745]: I0319 00:22:31.964985 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-vr5md"
Mar 19 00:22:34 crc kubenswrapper[4745]: I0319 00:22:34.998239 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"884040c3-6c56-45b0-881d-e73f52c0ab34","Type":"ContainerStarted","Data":"31a8a2e6d26279247a06c5389e7cc4810d760e5b6b34dc49c80127d9e15a8c71"}
Mar 19 00:22:35 crc kubenswrapper[4745]: I0319 00:22:35.001312 4745 scope.go:117] "RemoveContainer" containerID="4cf1138f66461b0db8f8d82a562c5595a0d75aaab97369e7761de279cdf0fb9b"
Mar 19 00:22:36 crc kubenswrapper[4745]: I0319 00:22:36.008666 4745 generic.go:334] "Generic (PLEG): container finished" podID="884040c3-6c56-45b0-881d-e73f52c0ab34" containerID="31a8a2e6d26279247a06c5389e7cc4810d760e5b6b34dc49c80127d9e15a8c71" exitCode=0
Mar 19 00:22:36 crc kubenswrapper[4745]: I0319 00:22:36.008757 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"884040c3-6c56-45b0-881d-e73f52c0ab34","Type":"ContainerDied","Data":"31a8a2e6d26279247a06c5389e7cc4810d760e5b6b34dc49c80127d9e15a8c71"}
Mar 19 00:22:37 crc kubenswrapper[4745]: I0319 00:22:37.190080 4745 generic.go:334] "Generic (PLEG): container finished" podID="884040c3-6c56-45b0-881d-e73f52c0ab34" containerID="759c92c7a60155345946936f340a97217251cee94b6ed490cd8f7fb3932d4668" exitCode=0
Mar 19 00:22:37 crc kubenswrapper[4745]: I0319 00:22:37.190158 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"884040c3-6c56-45b0-881d-e73f52c0ab34","Type":"ContainerDied","Data":"759c92c7a60155345946936f340a97217251cee94b6ed490cd8f7fb3932d4668"}
Mar 19 00:22:38 crc kubenswrapper[4745]: I0319 00:22:38.198638 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"884040c3-6c56-45b0-881d-e73f52c0ab34","Type":"ContainerStarted","Data":"f7ef869940915ea60ea453e1e20afbd2d82e3f9a1d01d7292debb45a26fd780c"}
Mar 19 00:22:38 crc kubenswrapper[4745]: I0319 00:22:38.198959 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0"
Mar 19 00:22:38 crc kubenswrapper[4745]: I0319 00:22:38.241697 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=19.546775413 podStartE2EDuration="1m20.241667527s" podCreationTimestamp="2026-03-19 00:21:18 +0000 UTC" firstStartedPulling="2026-03-19 00:21:33.227490461 +0000 UTC m=+857.765685602" lastFinishedPulling="2026-03-19 00:22:33.922382585 +0000 UTC m=+918.460577716" observedRunningTime="2026-03-19 00:22:38.232400758 +0000 UTC m=+922.770595919" watchObservedRunningTime="2026-03-19 00:22:38.241667527 +0000 UTC m=+922.779862678"
Mar 19 00:22:49 crc kubenswrapper[4745]: I0319 00:22:49.205140 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" containerName="elasticsearch" probeResult="failure" output=<
Mar 19 00:22:49 crc kubenswrapper[4745]: {"timestamp": "2026-03-19T00:22:49+00:00", "message": "readiness probe failed", "curl_rc": "7"}
Mar 19 00:22:49 crc kubenswrapper[4745]: >
Mar 19 00:22:54 crc kubenswrapper[4745]: I0319 00:22:54.403841 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" containerName="elasticsearch" probeResult="failure" output=<
Mar 19 00:22:54 crc kubenswrapper[4745]: {"timestamp": "2026-03-19T00:22:54+00:00", "message": "readiness probe failed", "curl_rc": "7"}
Mar 19 00:22:54 crc kubenswrapper[4745]: >
Mar 19 00:22:59 crc kubenswrapper[4745]: I0319 00:22:59.186870 4745 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="884040c3-6c56-45b0-881d-e73f52c0ab34" containerName="elasticsearch" probeResult="failure" output=<
Mar 19 00:22:59 crc kubenswrapper[4745]: {"timestamp": "2026-03-19T00:22:59+00:00", "message": "readiness probe failed", "curl_rc": "7"}
Mar 19 00:22:59 crc kubenswrapper[4745]: >
Mar 19 00:23:04 crc kubenswrapper[4745]: I0319 00:23:04.578692 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.266856 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b6sd9"]
Mar 19 00:23:40 crc kubenswrapper[4745]: E0319 00:23:40.267925 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.267943 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server"
Mar 19 00:23:40 crc kubenswrapper[4745]: E0319 00:23:40.267959 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="extract-utilities"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.267968 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="extract-utilities"
Mar 19 00:23:40 crc kubenswrapper[4745]: E0319 00:23:40.267994 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="extract-content"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.268002 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="extract-content"
Mar 19 00:23:40 crc kubenswrapper[4745]: E0319 00:23:40.268014 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="deed3a0c-ada3-41b5-895b-8acc45926539" containerName="oc"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.268020 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="deed3a0c-ada3-41b5-895b-8acc45926539" containerName="oc"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.268154 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="deed3a0c-ada3-41b5-895b-8acc45926539" containerName="oc"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.268179 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2886a73a-35b3-4014-ab67-1b88fa88b4d8" containerName="registry-server"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.269200 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.285094 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b6sd9"]
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.390145 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.390224 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.390308 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jc9c\" (UniqueName: \"kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.491093 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jc9c\" (UniqueName: \"kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.491472 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.491582 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.492019 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.492043 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.511375 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jc9c\" (UniqueName: \"kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c\") pod \"community-operators-b6sd9\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") " pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.588003 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:40 crc kubenswrapper[4745]: I0319 00:23:40.928716 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b6sd9"]
Mar 19 00:23:41 crc kubenswrapper[4745]: I0319 00:23:41.838652 4745 generic.go:334] "Generic (PLEG): container finished" podID="23856082-7489-4bd9-8561-9492d211f62f" containerID="c31f53226edb12a4c47ac12a63ba58b187d38f0846755a55d90f6873ac538535" exitCode=0
Mar 19 00:23:41 crc kubenswrapper[4745]: I0319 00:23:41.838711 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerDied","Data":"c31f53226edb12a4c47ac12a63ba58b187d38f0846755a55d90f6873ac538535"}
Mar 19 00:23:41 crc kubenswrapper[4745]: I0319 00:23:41.838750 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerStarted","Data":"e8fbd88069b4baabfb51a65627fb36b764b5adb30577d5155d7c9e993c63fb41"}
Mar 19 00:23:42 crc kubenswrapper[4745]: I0319 00:23:42.846132 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerStarted","Data":"ff7a3fc1461925c1f307f9e0214f0f9f539b3c62a6d1fccb088ba8d69a4dd25b"}
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.044937 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s8x87"]
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.047041 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.062537 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8x87"]
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.233559 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-utilities\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.233769 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-catalog-content\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.233854 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7wh5\" (UniqueName: \"kubernetes.io/projected/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-kube-api-access-r7wh5\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.335183 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7wh5\" (UniqueName: \"kubernetes.io/projected/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-kube-api-access-r7wh5\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.335268 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-utilities\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.335355 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-catalog-content\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.335752 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-utilities\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.336009 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-catalog-content\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.360279 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7wh5\" (UniqueName: \"kubernetes.io/projected/7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b-kube-api-access-r7wh5\") pod \"certified-operators-s8x87\" (UID: \"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b\") " pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.366088 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s8x87"
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.857414 4745 generic.go:334] "Generic (PLEG): container finished" podID="23856082-7489-4bd9-8561-9492d211f62f" containerID="ff7a3fc1461925c1f307f9e0214f0f9f539b3c62a6d1fccb088ba8d69a4dd25b" exitCode=0
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.857897 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerDied","Data":"ff7a3fc1461925c1f307f9e0214f0f9f539b3c62a6d1fccb088ba8d69a4dd25b"}
Mar 19 00:23:43 crc kubenswrapper[4745]: I0319 00:23:43.894552 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8x87"]
Mar 19 00:23:44 crc kubenswrapper[4745]: I0319 00:23:44.887289 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerStarted","Data":"b56c754a942cf08b52d13bb31750aab1a9f071be41bf7e22cb0202eb991d6064"}
Mar 19 00:23:44 crc kubenswrapper[4745]: I0319 00:23:44.890070 4745 generic.go:334] "Generic (PLEG): container finished" podID="7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b" containerID="7bd9ccf914f960b4e9d8c85316c1ce12252a1497a9241b7b47ad5214ad9c9cc1" exitCode=0
Mar 19 00:23:44 crc kubenswrapper[4745]: I0319 00:23:44.890164 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8x87" event={"ID":"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b","Type":"ContainerDied","Data":"7bd9ccf914f960b4e9d8c85316c1ce12252a1497a9241b7b47ad5214ad9c9cc1"}
Mar 19 00:23:44 crc kubenswrapper[4745]: I0319 00:23:44.891068 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8x87" event={"ID":"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b","Type":"ContainerStarted","Data":"f0786173a026e18ad20f4f30038dc776483d295db9841f6a9b8b8bd9268114cd"}
Mar 19 00:23:44 crc kubenswrapper[4745]: I0319 00:23:44.912658 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b6sd9" podStartSLOduration=2.474492893 podStartE2EDuration="4.912636918s" podCreationTimestamp="2026-03-19 00:23:40 +0000 UTC" firstStartedPulling="2026-03-19 00:23:41.840766245 +0000 UTC m=+986.378961376" lastFinishedPulling="2026-03-19 00:23:44.27891027 +0000 UTC m=+988.817105401" observedRunningTime="2026-03-19 00:23:44.909304204 +0000 UTC m=+989.447499345" watchObservedRunningTime="2026-03-19 00:23:44.912636918 +0000 UTC m=+989.450832059"
Mar 19 00:23:45 crc kubenswrapper[4745]: I0319 00:23:45.606478 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 00:23:45 crc kubenswrapper[4745]: I0319 00:23:45.606571 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 00:23:50 crc kubenswrapper[4745]: I0319 00:23:50.589045 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:50 crc kubenswrapper[4745]: I0319 00:23:50.589813 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:50 crc kubenswrapper[4745]: I0319 00:23:50.781736 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:51 crc kubenswrapper[4745]: I0319 00:23:51.028527 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:51 crc kubenswrapper[4745]: I0319 00:23:51.098757 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b6sd9"]
Mar 19 00:23:53 crc kubenswrapper[4745]: I0319 00:23:53.006151 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b6sd9" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="registry-server" containerID="cri-o://b56c754a942cf08b52d13bb31750aab1a9f071be41bf7e22cb0202eb991d6064" gracePeriod=2
Mar 19 00:23:54 crc kubenswrapper[4745]: I0319 00:23:54.017249 4745 generic.go:334] "Generic (PLEG): container finished" podID="23856082-7489-4bd9-8561-9492d211f62f" containerID="b56c754a942cf08b52d13bb31750aab1a9f071be41bf7e22cb0202eb991d6064" exitCode=0
Mar 19 00:23:54 crc kubenswrapper[4745]: I0319 00:23:54.017322 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerDied","Data":"b56c754a942cf08b52d13bb31750aab1a9f071be41bf7e22cb0202eb991d6064"}
Mar 19 00:23:55 crc kubenswrapper[4745]: I0319 00:23:55.966116 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.093699 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities\") pod \"23856082-7489-4bd9-8561-9492d211f62f\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") "
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.093871 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jc9c\" (UniqueName: \"kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c\") pod \"23856082-7489-4bd9-8561-9492d211f62f\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") "
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.093928 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content\") pod \"23856082-7489-4bd9-8561-9492d211f62f\" (UID: \"23856082-7489-4bd9-8561-9492d211f62f\") "
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.098445 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities" (OuterVolumeSpecName: "utilities") pod "23856082-7489-4bd9-8561-9492d211f62f" (UID: "23856082-7489-4bd9-8561-9492d211f62f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.100113 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b6sd9"
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.100126 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b6sd9" event={"ID":"23856082-7489-4bd9-8561-9492d211f62f","Type":"ContainerDied","Data":"e8fbd88069b4baabfb51a65627fb36b764b5adb30577d5155d7c9e993c63fb41"}
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.100197 4745 scope.go:117] "RemoveContainer" containerID="b56c754a942cf08b52d13bb31750aab1a9f071be41bf7e22cb0202eb991d6064"
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.105013 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8x87" event={"ID":"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b","Type":"ContainerStarted","Data":"b1f131ab406d856dab787928a0854b0baa6f609e17dd782cdce42e98836eb0fb"}
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.107178 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c" (OuterVolumeSpecName: "kube-api-access-9jc9c") pod "23856082-7489-4bd9-8561-9492d211f62f" (UID: "23856082-7489-4bd9-8561-9492d211f62f"). InnerVolumeSpecName "kube-api-access-9jc9c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.135260 4745 scope.go:117] "RemoveContainer" containerID="ff7a3fc1461925c1f307f9e0214f0f9f539b3c62a6d1fccb088ba8d69a4dd25b"
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.156605 4745 scope.go:117] "RemoveContainer" containerID="c31f53226edb12a4c47ac12a63ba58b187d38f0846755a55d90f6873ac538535"
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.170115 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23856082-7489-4bd9-8561-9492d211f62f" (UID: "23856082-7489-4bd9-8561-9492d211f62f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.195945 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.195993 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jc9c\" (UniqueName: \"kubernetes.io/projected/23856082-7489-4bd9-8561-9492d211f62f-kube-api-access-9jc9c\") on node \"crc\" DevicePath \"\""
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.196004 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23856082-7489-4bd9-8561-9492d211f62f-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.433518 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b6sd9"]
Mar 19 00:23:56 crc kubenswrapper[4745]: I0319 00:23:56.447852 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b6sd9"]
Mar 19 00:23:57 crc kubenswrapper[4745]: I0319 00:23:57.115851 4745 generic.go:334] "Generic (PLEG): container finished" podID="7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b" containerID="b1f131ab406d856dab787928a0854b0baa6f609e17dd782cdce42e98836eb0fb" exitCode=0
Mar 19 00:23:57 crc kubenswrapper[4745]: I0319 00:23:57.115929 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8x87" event={"ID":"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b","Type":"ContainerDied","Data":"b1f131ab406d856dab787928a0854b0baa6f609e17dd782cdce42e98836eb0fb"}
Mar 19 00:23:58 crc kubenswrapper[4745]: I0319 00:23:58.147729 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23856082-7489-4bd9-8561-9492d211f62f" path="/var/lib/kubelet/pods/23856082-7489-4bd9-8561-9492d211f62f/volumes"
Mar 19 00:23:59 crc kubenswrapper[4745]: I0319 00:23:59.135580 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s8x87" event={"ID":"7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b","Type":"ContainerStarted","Data":"f0ace3c9cf8d5ce03926099d2106cc64f7295db6ffdd337710c9734b56b66f65"}
Mar 19 00:23:59 crc kubenswrapper[4745]: I0319 00:23:59.159556 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s8x87" podStartSLOduration=2.8302934950000003 podStartE2EDuration="16.159529669s" podCreationTimestamp="2026-03-19 00:23:43 +0000 UTC" firstStartedPulling="2026-03-19 00:23:44.899662986 +0000 UTC m=+989.437858117" lastFinishedPulling="2026-03-19 00:23:58.22889916 +0000 UTC m=+1002.767094291" observedRunningTime="2026-03-19 00:23:59.156771583 +0000 UTC m=+1003.694966724" watchObservedRunningTime="2026-03-19 00:23:59.159529669 +0000 UTC m=+1003.697724850"
Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.148297 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564664-hhtnq"]
Mar 19 00:24:00 crc kubenswrapper[4745]: E0319 00:24:00.148795 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="extract-content"
Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.148812 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="extract-content"
Mar 19 00:24:00 crc kubenswrapper[4745]: E0319 00:24:00.148832 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="registry-server"
Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.148839 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="registry-server"
Mar 19 00:24:00 crc kubenswrapper[4745]: E0319 00:24:00.148852 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="extract-utilities"
Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.148858 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="extract-utilities"
Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.148986 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="23856082-7489-4bd9-8561-9492d211f62f" containerName="registry-server"
Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.149466 4745 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.151969 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.152949 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.154287 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.168094 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564664-hhtnq"] Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.253744 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4zj58\" (UniqueName: \"kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58\") pod \"auto-csr-approver-29564664-hhtnq\" (UID: \"d9a8819f-c57d-463c-9089-fbf3b29e12bc\") " pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.355372 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4zj58\" (UniqueName: \"kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58\") pod \"auto-csr-approver-29564664-hhtnq\" (UID: \"d9a8819f-c57d-463c-9089-fbf3b29e12bc\") " pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.400549 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4zj58\" (UniqueName: \"kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58\") pod \"auto-csr-approver-29564664-hhtnq\" (UID: \"d9a8819f-c57d-463c-9089-fbf3b29e12bc\") " 
pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.467732 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:00 crc kubenswrapper[4745]: I0319 00:24:00.744278 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564664-hhtnq"] Mar 19 00:24:01 crc kubenswrapper[4745]: I0319 00:24:01.149440 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" event={"ID":"d9a8819f-c57d-463c-9089-fbf3b29e12bc","Type":"ContainerStarted","Data":"742098faa80518f702910fff7c9cc0a70dc0fad59e218550a4356a422ed77c0a"} Mar 19 00:24:03 crc kubenswrapper[4745]: I0319 00:24:03.165538 4745 generic.go:334] "Generic (PLEG): container finished" podID="d9a8819f-c57d-463c-9089-fbf3b29e12bc" containerID="102c360c5a32588e5b65407ade841670a288ba3942421d8abc329207a20bc972" exitCode=0 Mar 19 00:24:03 crc kubenswrapper[4745]: I0319 00:24:03.165642 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" event={"ID":"d9a8819f-c57d-463c-9089-fbf3b29e12bc","Type":"ContainerDied","Data":"102c360c5a32588e5b65407ade841670a288ba3942421d8abc329207a20bc972"} Mar 19 00:24:03 crc kubenswrapper[4745]: I0319 00:24:03.366928 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:24:03 crc kubenswrapper[4745]: I0319 00:24:03.366991 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:24:03 crc kubenswrapper[4745]: I0319 00:24:03.412820 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.214568 4745 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s8x87" Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.328366 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s8x87"] Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.371951 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"] Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.372297 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vj7rp" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="registry-server" containerID="cri-o://d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f" gracePeriod=2 Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.773568 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.930733 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content\") pod \"fa950165-f194-4022-8333-581d7681fc74\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.930875 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities\") pod \"fa950165-f194-4022-8333-581d7681fc74\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.930949 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6z2b\" (UniqueName: \"kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b\") pod 
\"fa950165-f194-4022-8333-581d7681fc74\" (UID: \"fa950165-f194-4022-8333-581d7681fc74\") " Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.933173 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities" (OuterVolumeSpecName: "utilities") pod "fa950165-f194-4022-8333-581d7681fc74" (UID: "fa950165-f194-4022-8333-581d7681fc74"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.938314 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b" (OuterVolumeSpecName: "kube-api-access-q6z2b") pod "fa950165-f194-4022-8333-581d7681fc74" (UID: "fa950165-f194-4022-8333-581d7681fc74"). InnerVolumeSpecName "kube-api-access-q6z2b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:24:04 crc kubenswrapper[4745]: I0319 00:24:04.986633 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa950165-f194-4022-8333-581d7681fc74" (UID: "fa950165-f194-4022-8333-581d7681fc74"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.033068 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.033126 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6z2b\" (UniqueName: \"kubernetes.io/projected/fa950165-f194-4022-8333-581d7681fc74-kube-api-access-q6z2b\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.033143 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa950165-f194-4022-8333-581d7681fc74-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.183069 4745 generic.go:334] "Generic (PLEG): container finished" podID="fa950165-f194-4022-8333-581d7681fc74" containerID="d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f" exitCode=0 Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.183141 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vj7rp" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.183157 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerDied","Data":"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f"} Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.183211 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vj7rp" event={"ID":"fa950165-f194-4022-8333-581d7681fc74","Type":"ContainerDied","Data":"c186af9e1f5669e5491c37e499f3a6a8a28b64cddfa66b87effddfaec8dbd826"} Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.183233 4745 scope.go:117] "RemoveContainer" containerID="d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.200742 4745 scope.go:117] "RemoveContainer" containerID="e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.227046 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"] Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.227181 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vj7rp"] Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.247068 4745 scope.go:117] "RemoveContainer" containerID="04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.260002 4745 scope.go:117] "RemoveContainer" containerID="d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f" Mar 19 00:24:05 crc kubenswrapper[4745]: E0319 00:24:05.260434 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f\": container with ID starting with d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f not found: ID does not exist" containerID="d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.260494 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f"} err="failed to get container status \"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f\": rpc error: code = NotFound desc = could not find container \"d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f\": container with ID starting with d230cfe4a14ca254e7131defa3b7ece4cc42435cb44ea389daf0ff158fd62d0f not found: ID does not exist" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.260527 4745 scope.go:117] "RemoveContainer" containerID="e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0" Mar 19 00:24:05 crc kubenswrapper[4745]: E0319 00:24:05.260948 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0\": container with ID starting with e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0 not found: ID does not exist" containerID="e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.260983 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0"} err="failed to get container status \"e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0\": rpc error: code = NotFound desc = could not find container \"e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0\": container with ID 
starting with e0c2c64d91d54bad025206ae8e889ae3bcae29781d50b930918ad5539a0294b0 not found: ID does not exist" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.261008 4745 scope.go:117] "RemoveContainer" containerID="04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc" Mar 19 00:24:05 crc kubenswrapper[4745]: E0319 00:24:05.261328 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc\": container with ID starting with 04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc not found: ID does not exist" containerID="04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.261361 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc"} err="failed to get container status \"04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc\": rpc error: code = NotFound desc = could not find container \"04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc\": container with ID starting with 04cbc60e7b7444e8ef8e432f63858b2c91b45f5ff16eaf690cc1570df3f885fc not found: ID does not exist" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.414367 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.539001 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4zj58\" (UniqueName: \"kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58\") pod \"d9a8819f-c57d-463c-9089-fbf3b29e12bc\" (UID: \"d9a8819f-c57d-463c-9089-fbf3b29e12bc\") " Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.554449 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58" (OuterVolumeSpecName: "kube-api-access-4zj58") pod "d9a8819f-c57d-463c-9089-fbf3b29e12bc" (UID: "d9a8819f-c57d-463c-9089-fbf3b29e12bc"). InnerVolumeSpecName "kube-api-access-4zj58". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:24:05 crc kubenswrapper[4745]: I0319 00:24:05.640807 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4zj58\" (UniqueName: \"kubernetes.io/projected/d9a8819f-c57d-463c-9089-fbf3b29e12bc-kube-api-access-4zj58\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.146006 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa950165-f194-4022-8333-581d7681fc74" path="/var/lib/kubelet/pods/fa950165-f194-4022-8333-581d7681fc74/volumes" Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.191963 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.191989 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564664-hhtnq" event={"ID":"d9a8819f-c57d-463c-9089-fbf3b29e12bc","Type":"ContainerDied","Data":"742098faa80518f702910fff7c9cc0a70dc0fad59e218550a4356a422ed77c0a"} Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.192499 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="742098faa80518f702910fff7c9cc0a70dc0fad59e218550a4356a422ed77c0a" Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.492522 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564658-6vdd5"] Mar 19 00:24:06 crc kubenswrapper[4745]: I0319 00:24:06.496989 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564658-6vdd5"] Mar 19 00:24:08 crc kubenswrapper[4745]: I0319 00:24:08.146176 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7807a7d0-ff52-4a76-b083-19eca144b510" path="/var/lib/kubelet/pods/7807a7d0-ff52-4a76-b083-19eca144b510/volumes" Mar 19 00:24:15 crc kubenswrapper[4745]: I0319 00:24:15.606780 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:24:15 crc kubenswrapper[4745]: I0319 00:24:15.607659 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:24:21 crc 
kubenswrapper[4745]: I0319 00:24:21.294297 4745 generic.go:334] "Generic (PLEG): container finished" podID="9628a478-fb27-4c42-bcf5-2a329898708b" containerID="b4703b83174ab7b2103b6b149b2ad8cb67a489bfe97d982169000bed076edd3c" exitCode=0 Mar 19 00:24:21 crc kubenswrapper[4745]: I0319 00:24:21.294384 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerDied","Data":"b4703b83174ab7b2103b6b149b2ad8cb67a489bfe97d982169000bed076edd3c"} Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.573748 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698305 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698366 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698399 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698420 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698447 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698487 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698508 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698537 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698562 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root\") pod 
\"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698682 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq4w5\" (UniqueName: \"kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698718 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698721 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.698815 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir\") pod \"9628a478-fb27-4c42-bcf5-2a329898708b\" (UID: \"9628a478-fb27-4c42-bcf5-2a329898708b\") " Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.699060 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.699166 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.699184 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9628a478-fb27-4c42-bcf5-2a329898708b-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.699763 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.700248 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.700628 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.700793 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.708068 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.708102 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.708121 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5" (OuterVolumeSpecName: "kube-api-access-zq4w5") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "kube-api-access-zq4w5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.734637 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800737 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800772 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800783 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800792 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800801 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800809 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9628a478-fb27-4c42-bcf5-2a329898708b-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800819 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9628a478-fb27-4c42-bcf5-2a329898708b-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.800828 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq4w5\" (UniqueName: \"kubernetes.io/projected/9628a478-fb27-4c42-bcf5-2a329898708b-kube-api-access-zq4w5\") on node \"crc\" DevicePath \"\""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.870977 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:24:22 crc kubenswrapper[4745]: I0319 00:24:22.901637 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 19 00:24:23 crc kubenswrapper[4745]: I0319 00:24:23.311461 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9628a478-fb27-4c42-bcf5-2a329898708b","Type":"ContainerDied","Data":"58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978"}
Mar 19 00:24:23 crc kubenswrapper[4745]: I0319 00:24:23.311516 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58b8553adc548ce0bd4cb3a6183d89b4d03da6f1f5a7173d3c08d371d62ae978"
Mar 19 00:24:23 crc kubenswrapper[4745]: I0319 00:24:23.311560 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build"
Mar 19 00:24:24 crc kubenswrapper[4745]: I0319 00:24:24.494469 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "9628a478-fb27-4c42-bcf5-2a329898708b" (UID: "9628a478-fb27-4c42-bcf5-2a329898708b"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:24:24 crc kubenswrapper[4745]: I0319 00:24:24.528615 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9628a478-fb27-4c42-bcf5-2a329898708b-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.109502 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110479 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="docker-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110496 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="docker-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110508 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="registry-server"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110515 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="registry-server"
Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110525 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="git-clone"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110535 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="git-clone"
Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110549 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="extract-utilities"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110555 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="extract-utilities"
Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110575 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9a8819f-c57d-463c-9089-fbf3b29e12bc" containerName="oc"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110582 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9a8819f-c57d-463c-9089-fbf3b29e12bc" containerName="oc"
Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110593 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="extract-content"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110601 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="extract-content"
Mar 19 00:24:27 crc kubenswrapper[4745]: E0319 00:24:27.110611 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="manage-dockerfile"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110618 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="manage-dockerfile"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110747 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa950165-f194-4022-8333-581d7681fc74" containerName="registry-server"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110773 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9a8819f-c57d-463c-9089-fbf3b29e12bc" containerName="oc"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.110783 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9628a478-fb27-4c42-bcf5-2a329898708b" containerName="docker-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.111638 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.115378 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-ca"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.115448 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-sys-config"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.115706 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.115852 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-1-global-ca"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.125599 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.166721 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxdzw\" (UniqueName: \"kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.166766 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.166841 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.166932 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167021 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167162 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167229 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167322 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167395 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167415 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167536 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.167584 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268540 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxdzw\" (UniqueName: \"kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268592 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268625 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268649 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268681 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268701 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268716 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268740 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268773 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268796 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268817 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268841 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.268947 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.269092 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.269418 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.269748 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.269927 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.270144 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.270234 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.270373 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.270840 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.275612 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.276687 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.284915 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxdzw\" (UniqueName: \"kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw\") pod \"smart-gateway-operator-1-build\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.435085 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build"
Mar 19 00:24:27 crc kubenswrapper[4745]: I0319 00:24:27.640534 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 19 00:24:28 crc kubenswrapper[4745]: I0319 00:24:28.345488 4745 generic.go:334] "Generic (PLEG): container finished" podID="0383e703-f206-4571-8ca3-be59433df02c" containerID="597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82" exitCode=0
Mar 19 00:24:28 crc kubenswrapper[4745]: I0319 00:24:28.345547 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"0383e703-f206-4571-8ca3-be59433df02c","Type":"ContainerDied","Data":"597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82"}
Mar 19 00:24:28 crc kubenswrapper[4745]: I0319 00:24:28.345581 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"0383e703-f206-4571-8ca3-be59433df02c","Type":"ContainerStarted","Data":"137b1e8f43fe79dfa16fb03e4c3e074784118642c8f0428bdc642efcff9cad0a"}
Mar 19 00:24:29 crc kubenswrapper[4745]: I0319 00:24:29.356279 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"0383e703-f206-4571-8ca3-be59433df02c","Type":"ContainerStarted","Data":"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75"}
Mar 19 00:24:29 crc kubenswrapper[4745]: I0319 00:24:29.390578 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-1-build" podStartSLOduration=2.390549957 podStartE2EDuration="2.390549957s" podCreationTimestamp="2026-03-19 00:24:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:24:29.381637711 +0000 UTC m=+1033.919832852" watchObservedRunningTime="2026-03-19 00:24:29.390549957 +0000 UTC m=+1033.928745088"
Mar 19 00:24:35 crc kubenswrapper[4745]: I0319 00:24:35.100728 4745 scope.go:117] "RemoveContainer" containerID="91311d7617172e5175d1b2c1df977704664ce95b1113f4d27a4b6a3392f4c27c"
Mar 19 00:24:37 crc kubenswrapper[4745]: I0319 00:24:37.681373 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"]
Mar 19 00:24:37 crc kubenswrapper[4745]: I0319 00:24:37.682932 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/smart-gateway-operator-1-build" podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="docker-build" containerID="cri-o://b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75" gracePeriod=30
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.308668 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"]
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.310341 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.313099 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-sys-config"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.313170 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-global-ca"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.315409 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-2-ca"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.341014 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"]
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.447825 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.447955 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.447997 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.448800 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.448905 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.448951 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449059 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449119 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449166 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449189 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449214 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.449239 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhxg4\" (UniqueName: \"kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551285 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551361 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551403 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551454 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551486 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551532 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551539 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551562 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551674 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551706 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhxg4\" (UniqueName: \"kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551749 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551783 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.551874 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552003 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build"
Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552068 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for
volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552091 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552312 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552818 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.552931 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.553023 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.553036 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.558937 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.558958 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.575846 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhxg4\" (UniqueName: \"kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4\") pod \"smart-gateway-operator-2-build\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 
00:24:39.632417 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:24:39 crc kubenswrapper[4745]: I0319 00:24:39.865508 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-2-build"] Mar 19 00:24:39 crc kubenswrapper[4745]: W0319 00:24:39.871738 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c3be06c_b88a_4749_b788_876b92486d65.slice/crio-1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60 WatchSource:0}: Error finding container 1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60: Status 404 returned error can't find the container with id 1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60 Mar 19 00:24:40 crc kubenswrapper[4745]: I0319 00:24:40.445920 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerStarted","Data":"1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60"} Mar 19 00:24:41 crc kubenswrapper[4745]: I0319 00:24:41.453936 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerStarted","Data":"1f712e863eed19f54a1076d7614b884aef6cf3e4d828606ee2cf52d7ed11bf86"} Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.190438 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_0383e703-f206-4571-8ca3-be59433df02c/docker-build/0.log" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.191399 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.302390 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.302467 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303603 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303653 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303693 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303723 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303751 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303786 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303812 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303908 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303946 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets\") pod 
\"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.303974 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxdzw\" (UniqueName: \"kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw\") pod \"0383e703-f206-4571-8ca3-be59433df02c\" (UID: \"0383e703-f206-4571-8ca3-be59433df02c\") " Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.304153 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.304525 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.304971 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.305104 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.307681 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.307719 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.308564 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.313161 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.313189 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw" (OuterVolumeSpecName: "kube-api-access-mxdzw") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "kube-api-access-mxdzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.313643 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.404678 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxdzw\" (UniqueName: \"kubernetes.io/projected/0383e703-f206-4571-8ca3-be59433df02c-kube-api-access-mxdzw\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405030 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405131 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/0383e703-f206-4571-8ca3-be59433df02c-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405195 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405250 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405345 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405415 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/0383e703-f206-4571-8ca3-be59433df02c-build-proxy-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405491 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405549 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.405601 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/0383e703-f206-4571-8ca3-be59433df02c-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.447860 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.463283 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-1-build_0383e703-f206-4571-8ca3-be59433df02c/docker-build/0.log" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.463859 4745 generic.go:334] "Generic (PLEG): container finished" podID="0383e703-f206-4571-8ca3-be59433df02c" containerID="b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75" exitCode=1 Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.463983 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-1-build" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.464028 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"0383e703-f206-4571-8ca3-be59433df02c","Type":"ContainerDied","Data":"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75"} Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.464064 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-1-build" event={"ID":"0383e703-f206-4571-8ca3-be59433df02c","Type":"ContainerDied","Data":"137b1e8f43fe79dfa16fb03e4c3e074784118642c8f0428bdc642efcff9cad0a"} Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.464084 4745 scope.go:117] "RemoveContainer" containerID="b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.507397 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.531912 4745 scope.go:117] "RemoveContainer" containerID="597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.569865 4745 scope.go:117] "RemoveContainer" containerID="b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75" Mar 19 00:24:42 crc kubenswrapper[4745]: E0319 00:24:42.570870 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75\": container with ID starting with b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75 not found: ID does not exist" containerID="b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75" 
Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.570925 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75"} err="failed to get container status \"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75\": rpc error: code = NotFound desc = could not find container \"b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75\": container with ID starting with b638bea281f4c491357e07c0f602e02b4832e25d7f783393f0360cac0fedde75 not found: ID does not exist" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.570962 4745 scope.go:117] "RemoveContainer" containerID="597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82" Mar 19 00:24:42 crc kubenswrapper[4745]: E0319 00:24:42.571245 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82\": container with ID starting with 597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82 not found: ID does not exist" containerID="597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.571838 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82"} err="failed to get container status \"597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82\": rpc error: code = NotFound desc = could not find container \"597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82\": container with ID starting with 597e164be2d5243ea949eb722a24e3bbf671180a5db2f0f227553d09e1d12e82 not found: ID does not exist" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.698315 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "0383e703-f206-4571-8ca3-be59433df02c" (UID: "0383e703-f206-4571-8ca3-be59433df02c"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.710648 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/0383e703-f206-4571-8ca3-be59433df02c-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.797281 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 19 00:24:42 crc kubenswrapper[4745]: I0319 00:24:42.803129 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-1-build"] Mar 19 00:24:43 crc kubenswrapper[4745]: I0319 00:24:43.473549 4745 generic.go:334] "Generic (PLEG): container finished" podID="7c3be06c-b88a-4749-b788-876b92486d65" containerID="1f712e863eed19f54a1076d7614b884aef6cf3e4d828606ee2cf52d7ed11bf86" exitCode=0 Mar 19 00:24:43 crc kubenswrapper[4745]: I0319 00:24:43.473605 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerDied","Data":"1f712e863eed19f54a1076d7614b884aef6cf3e4d828606ee2cf52d7ed11bf86"} Mar 19 00:24:44 crc kubenswrapper[4745]: I0319 00:24:44.146824 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0383e703-f206-4571-8ca3-be59433df02c" path="/var/lib/kubelet/pods/0383e703-f206-4571-8ca3-be59433df02c/volumes" Mar 19 00:24:44 crc kubenswrapper[4745]: I0319 00:24:44.483426 4745 generic.go:334] "Generic (PLEG): container finished" podID="7c3be06c-b88a-4749-b788-876b92486d65" 
containerID="8387abf70bd6b56edb79f7d09a543aa55e696cf32abc591019ab21a211e52480" exitCode=0 Mar 19 00:24:44 crc kubenswrapper[4745]: I0319 00:24:44.483835 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerDied","Data":"8387abf70bd6b56edb79f7d09a543aa55e696cf32abc591019ab21a211e52480"} Mar 19 00:24:44 crc kubenswrapper[4745]: I0319 00:24:44.520563 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-2-build_7c3be06c-b88a-4749-b788-876b92486d65/manage-dockerfile/0.log" Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.510002 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerStarted","Data":"2ab2822f8ced8f2081870ab6c5f34700f462d5ced58b7c2cbf1f5d29b9599f13"} Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.556620 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-2-build" podStartSLOduration=6.556565649 podStartE2EDuration="6.556565649s" podCreationTimestamp="2026-03-19 00:24:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:24:45.548342895 +0000 UTC m=+1050.086538036" watchObservedRunningTime="2026-03-19 00:24:45.556565649 +0000 UTC m=+1050.094760780" Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.606038 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.606136 4745 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.606207 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.607384 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:24:45 crc kubenswrapper[4745]: I0319 00:24:45.607464 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06" gracePeriod=600 Mar 19 00:24:46 crc kubenswrapper[4745]: I0319 00:24:46.520192 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06" exitCode=0 Mar 19 00:24:46 crc kubenswrapper[4745]: I0319 00:24:46.520265 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06"} Mar 19 00:24:46 crc kubenswrapper[4745]: I0319 00:24:46.520724 4745 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89"} Mar 19 00:24:46 crc kubenswrapper[4745]: I0319 00:24:46.520780 4745 scope.go:117] "RemoveContainer" containerID="c5fdbb323d46092e1c016d6578086be40b0bfff7d18498b3c00eacd6a8fbf018" Mar 19 00:25:50 crc kubenswrapper[4745]: I0319 00:25:50.990546 4745 generic.go:334] "Generic (PLEG): container finished" podID="7c3be06c-b88a-4749-b788-876b92486d65" containerID="2ab2822f8ced8f2081870ab6c5f34700f462d5ced58b7c2cbf1f5d29b9599f13" exitCode=0 Mar 19 00:25:50 crc kubenswrapper[4745]: I0319 00:25:50.990718 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerDied","Data":"2ab2822f8ced8f2081870ab6c5f34700f462d5ced58b7c2cbf1f5d29b9599f13"} Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.250168 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.358200 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.358239 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.358295 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.358347 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359422 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.358395 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359525 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359593 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359557 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359678 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359716 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.360958 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.360997 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhxg4\" (UniqueName: \"kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.359976 4745 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.360032 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.360711 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.361023 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles\") pod \"7c3be06c-b88a-4749-b788-876b92486d65\" (UID: \"7c3be06c-b88a-4749-b788-876b92486d65\") " Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362724 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362924 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362942 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362955 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362966 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7c3be06c-b88a-4749-b788-876b92486d65-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc 
kubenswrapper[4745]: I0319 00:25:52.362976 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7c3be06c-b88a-4749-b788-876b92486d65-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.362987 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.364123 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.367105 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.367230 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4" (OuterVolumeSpecName: "kube-api-access-mhxg4") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "kube-api-access-mhxg4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.368284 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.465075 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.465109 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7c3be06c-b88a-4749-b788-876b92486d65-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.465119 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhxg4\" (UniqueName: \"kubernetes.io/projected/7c3be06c-b88a-4749-b788-876b92486d65-kube-api-access-mhxg4\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.465130 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.537107 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: 
"7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:25:52 crc kubenswrapper[4745]: I0319 00:25:52.565536 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:53 crc kubenswrapper[4745]: I0319 00:25:53.005446 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-2-build" event={"ID":"7c3be06c-b88a-4749-b788-876b92486d65","Type":"ContainerDied","Data":"1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60"} Mar 19 00:25:53 crc kubenswrapper[4745]: I0319 00:25:53.005503 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b68bd195cf6bb7905a5718f20102eac43e6b9f21f64badfb9c94c5272dc5c60" Mar 19 00:25:53 crc kubenswrapper[4745]: I0319 00:25:53.005528 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-2-build" Mar 19 00:25:54 crc kubenswrapper[4745]: I0319 00:25:54.126851 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7c3be06c-b88a-4749-b788-876b92486d65" (UID: "7c3be06c-b88a-4749-b788-876b92486d65"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:25:54 crc kubenswrapper[4745]: I0319 00:25:54.190280 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7c3be06c-b88a-4749-b788-876b92486d65-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.960418 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:25:56 crc kubenswrapper[4745]: E0319 00:25:56.961131 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="git-clone" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961145 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="git-clone" Mar 19 00:25:56 crc kubenswrapper[4745]: E0319 00:25:56.961162 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="manage-dockerfile" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961168 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="manage-dockerfile" Mar 19 00:25:56 crc kubenswrapper[4745]: E0319 00:25:56.961178 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961185 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: E0319 00:25:56.961200 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961206 4745 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: E0319 00:25:56.961217 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="manage-dockerfile" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961223 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="manage-dockerfile" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961323 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c3be06c-b88a-4749-b788-876b92486d65" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.961339 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="0383e703-f206-4571-8ca3-be59433df02c" containerName="docker-build" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.962041 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.964238 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-sys-config" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.964556 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-ca" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.964714 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-1-global-ca" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.965084 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:25:56 crc kubenswrapper[4745]: I0319 00:25:56.986152 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071407 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071486 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071520 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071539 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071569 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 
00:25:57.071602 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97p88\" (UniqueName: \"kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.071848 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.072035 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.072069 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.072101 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc 
kubenswrapper[4745]: I0319 00:25:57.072200 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.072283 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174108 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174179 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174208 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97p88\" (UniqueName: \"kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174245 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174459 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174481 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174506 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174530 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174558 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174603 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174631 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.174659 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175016 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175193 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root\") pod 
\"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175476 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175494 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175533 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.175950 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.176307 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc 
kubenswrapper[4745]: I0319 00:25:57.176416 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.176081 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.182732 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.182799 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.192653 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97p88\" (UniqueName: \"kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88\") pod \"sg-core-1-build\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.280934 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 19 00:25:57 crc kubenswrapper[4745]: I0319 00:25:57.524245 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:25:58 crc kubenswrapper[4745]: I0319 00:25:58.037915 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"6c61d7a4-4470-4cbd-94f5-512619e989f6","Type":"ContainerStarted","Data":"7e76a7bf4434b7b734c027315e7c4db3ec75ff14cdec4eb02925ef312367c87c"} Mar 19 00:25:59 crc kubenswrapper[4745]: I0319 00:25:59.046190 4745 generic.go:334] "Generic (PLEG): container finished" podID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerID="eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226" exitCode=0 Mar 19 00:25:59 crc kubenswrapper[4745]: I0319 00:25:59.046307 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"6c61d7a4-4470-4cbd-94f5-512619e989f6","Type":"ContainerDied","Data":"eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226"} Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.062661 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"6c61d7a4-4470-4cbd-94f5-512619e989f6","Type":"ContainerStarted","Data":"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18"} Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.088076 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-1-build" podStartSLOduration=4.088050331 podStartE2EDuration="4.088050331s" podCreationTimestamp="2026-03-19 00:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:26:00.087026009 +0000 UTC m=+1124.625221170" watchObservedRunningTime="2026-03-19 00:26:00.088050331 +0000 UTC m=+1124.626245462" Mar 19 00:26:00 crc 
kubenswrapper[4745]: I0319 00:26:00.137057 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564666-lrgz2"] Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.138440 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.140847 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.141541 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.143107 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.161471 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564666-lrgz2"] Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.320536 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5jzx\" (UniqueName: \"kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx\") pod \"auto-csr-approver-29564666-lrgz2\" (UID: \"6fad60f0-0471-47eb-af8b-85d8a4a0c52f\") " pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.422634 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5jzx\" (UniqueName: \"kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx\") pod \"auto-csr-approver-29564666-lrgz2\" (UID: \"6fad60f0-0471-47eb-af8b-85d8a4a0c52f\") " pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.451396 4745 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-k5jzx\" (UniqueName: \"kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx\") pod \"auto-csr-approver-29564666-lrgz2\" (UID: \"6fad60f0-0471-47eb-af8b-85d8a4a0c52f\") " pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.493964 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:00 crc kubenswrapper[4745]: I0319 00:26:00.716289 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564666-lrgz2"] Mar 19 00:26:01 crc kubenswrapper[4745]: I0319 00:26:01.073600 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" event={"ID":"6fad60f0-0471-47eb-af8b-85d8a4a0c52f","Type":"ContainerStarted","Data":"d97390ec66de1b560d581631acdabd76a6dafd4e1f3aabdca27e817c5b8cc973"} Mar 19 00:26:02 crc kubenswrapper[4745]: I0319 00:26:02.091411 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" event={"ID":"6fad60f0-0471-47eb-af8b-85d8a4a0c52f","Type":"ContainerStarted","Data":"6373326992b421fd82709a30cf5b66d6c20b4b21e2598084ed73e4aa5185678e"} Mar 19 00:26:03 crc kubenswrapper[4745]: I0319 00:26:03.099867 4745 generic.go:334] "Generic (PLEG): container finished" podID="6fad60f0-0471-47eb-af8b-85d8a4a0c52f" containerID="6373326992b421fd82709a30cf5b66d6c20b4b21e2598084ed73e4aa5185678e" exitCode=0 Mar 19 00:26:03 crc kubenswrapper[4745]: I0319 00:26:03.099954 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" event={"ID":"6fad60f0-0471-47eb-af8b-85d8a4a0c52f","Type":"ContainerDied","Data":"6373326992b421fd82709a30cf5b66d6c20b4b21e2598084ed73e4aa5185678e"} Mar 19 00:26:04 crc kubenswrapper[4745]: I0319 00:26:04.345331 4745 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:04 crc kubenswrapper[4745]: I0319 00:26:04.486904 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5jzx\" (UniqueName: \"kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx\") pod \"6fad60f0-0471-47eb-af8b-85d8a4a0c52f\" (UID: \"6fad60f0-0471-47eb-af8b-85d8a4a0c52f\") " Mar 19 00:26:04 crc kubenswrapper[4745]: I0319 00:26:04.494152 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx" (OuterVolumeSpecName: "kube-api-access-k5jzx") pod "6fad60f0-0471-47eb-af8b-85d8a4a0c52f" (UID: "6fad60f0-0471-47eb-af8b-85d8a4a0c52f"). InnerVolumeSpecName "kube-api-access-k5jzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:26:04 crc kubenswrapper[4745]: I0319 00:26:04.588141 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5jzx\" (UniqueName: \"kubernetes.io/projected/6fad60f0-0471-47eb-af8b-85d8a4a0c52f-kube-api-access-k5jzx\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:05 crc kubenswrapper[4745]: I0319 00:26:05.113032 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" event={"ID":"6fad60f0-0471-47eb-af8b-85d8a4a0c52f","Type":"ContainerDied","Data":"d97390ec66de1b560d581631acdabd76a6dafd4e1f3aabdca27e817c5b8cc973"} Mar 19 00:26:05 crc kubenswrapper[4745]: I0319 00:26:05.113092 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d97390ec66de1b560d581631acdabd76a6dafd4e1f3aabdca27e817c5b8cc973" Mar 19 00:26:05 crc kubenswrapper[4745]: I0319 00:26:05.113111 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564666-lrgz2" Mar 19 00:26:05 crc kubenswrapper[4745]: I0319 00:26:05.169608 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564660-t5gfq"] Mar 19 00:26:05 crc kubenswrapper[4745]: I0319 00:26:05.174676 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564660-t5gfq"] Mar 19 00:26:06 crc kubenswrapper[4745]: I0319 00:26:06.147142 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="559d4ca4-399c-4504-8358-69d88bfdaf3a" path="/var/lib/kubelet/pods/559d4ca4-399c-4504-8358-69d88bfdaf3a/volumes" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.300797 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.301665 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/sg-core-1-build" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="docker-build" containerID="cri-o://e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18" gracePeriod=30 Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.701721 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_6c61d7a4-4470-4cbd-94f5-512619e989f6/docker-build/0.log" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.703125 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.835811 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.835876 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.835908 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.835959 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-97p88\" (UniqueName: \"kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.835987 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836034 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836068 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836023 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836094 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836235 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836264 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836293 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836343 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs\") pod \"6c61d7a4-4470-4cbd-94f5-512619e989f6\" (UID: \"6c61d7a4-4470-4cbd-94f5-512619e989f6\") " Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836399 4745 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836868 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.836914 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.838830 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.839158 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.839619 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.840243 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.840583 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.843454 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.843627 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88" (OuterVolumeSpecName: "kube-api-access-97p88") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "kube-api-access-97p88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.843830 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937793 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937840 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937863 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/6c61d7a4-4470-4cbd-94f5-512619e989f6-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937897 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937911 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937920 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937929 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.937940 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-97p88\" (UniqueName: \"kubernetes.io/projected/6c61d7a4-4470-4cbd-94f5-512619e989f6-kube-api-access-97p88\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.940352 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:26:07 crc kubenswrapper[4745]: I0319 00:26:07.952786 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "6c61d7a4-4470-4cbd-94f5-512619e989f6" (UID: "6c61d7a4-4470-4cbd-94f5-512619e989f6"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.039275 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.039355 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/6c61d7a4-4470-4cbd-94f5-512619e989f6-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.135352 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-1-build_6c61d7a4-4470-4cbd-94f5-512619e989f6/docker-build/0.log" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.135966 4745 generic.go:334] "Generic (PLEG): container finished" podID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerID="e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18" exitCode=1 Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.136026 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"6c61d7a4-4470-4cbd-94f5-512619e989f6","Type":"ContainerDied","Data":"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18"} Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.136079 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-1-build" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.136137 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-1-build" event={"ID":"6c61d7a4-4470-4cbd-94f5-512619e989f6","Type":"ContainerDied","Data":"7e76a7bf4434b7b734c027315e7c4db3ec75ff14cdec4eb02925ef312367c87c"} Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.136166 4745 scope.go:117] "RemoveContainer" containerID="e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.171261 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.179096 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-core-1-build"] Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.186510 4745 scope.go:117] "RemoveContainer" containerID="eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.205416 4745 scope.go:117] "RemoveContainer" containerID="e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18" Mar 19 00:26:08 crc kubenswrapper[4745]: E0319 00:26:08.205944 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18\": container with ID starting with e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18 not found: ID does not exist" containerID="e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.206010 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18"} err="failed to get container status 
\"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18\": rpc error: code = NotFound desc = could not find container \"e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18\": container with ID starting with e329da2044c3ef43c6b55100b12aa9033c5174e8414ae4603f8a44b6fb559c18 not found: ID does not exist" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.206049 4745 scope.go:117] "RemoveContainer" containerID="eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226" Mar 19 00:26:08 crc kubenswrapper[4745]: E0319 00:26:08.206961 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226\": container with ID starting with eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226 not found: ID does not exist" containerID="eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226" Mar 19 00:26:08 crc kubenswrapper[4745]: I0319 00:26:08.207030 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226"} err="failed to get container status \"eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226\": rpc error: code = NotFound desc = could not find container \"eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226\": container with ID starting with eb84edb075ac8eaed86d08b8865e3e689268eb507062b28c8aa019cc92d2a226 not found: ID does not exist" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.021517 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 19 00:26:09 crc kubenswrapper[4745]: E0319 00:26:09.021852 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="manage-dockerfile" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.021870 4745 
state_mem.go:107] "Deleted CPUSet assignment" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="manage-dockerfile" Mar 19 00:26:09 crc kubenswrapper[4745]: E0319 00:26:09.021923 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="docker-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.021931 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="docker-build" Mar 19 00:26:09 crc kubenswrapper[4745]: E0319 00:26:09.021941 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fad60f0-0471-47eb-af8b-85d8a4a0c52f" containerName="oc" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.021950 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fad60f0-0471-47eb-af8b-85d8a4a0c52f" containerName="oc" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.022072 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" containerName="docker-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.022090 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fad60f0-0471-47eb-af8b-85d8a4a0c52f" containerName="oc" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.023172 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.026104 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-ca" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.026344 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-sys-config" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.026637 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-core-2-global-ca" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.030335 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.049183 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053088 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053162 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053297 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053348 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053390 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7lvf\" (UniqueName: \"kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053443 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053567 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053660 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: 
\"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053807 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053840 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053898 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.053983 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.156657 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.156752 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7lvf\" (UniqueName: \"kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.156801 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157032 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157071 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157156 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157180 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157214 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157217 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157248 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157275 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs\") pod \"sg-core-2-build\" 
(UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157306 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157385 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157455 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157486 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157686 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: 
I0319 00:26:09.157937 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157986 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.157992 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.158151 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.158943 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.162435 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.173350 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.174108 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7lvf\" (UniqueName: \"kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf\") pod \"sg-core-2-build\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.338956 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 19 00:26:09 crc kubenswrapper[4745]: I0319 00:26:09.561288 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-core-2-build"] Mar 19 00:26:10 crc kubenswrapper[4745]: I0319 00:26:10.146424 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c61d7a4-4470-4cbd-94f5-512619e989f6" path="/var/lib/kubelet/pods/6c61d7a4-4470-4cbd-94f5-512619e989f6/volumes" Mar 19 00:26:10 crc kubenswrapper[4745]: I0319 00:26:10.157580 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerStarted","Data":"6af6d11839f61cc5eb6c4ada548e09fae08633a6e79b3849dd1445a55e9baf3d"} Mar 19 00:26:10 crc kubenswrapper[4745]: I0319 00:26:10.157630 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerStarted","Data":"7aed128a4133bb08f4b1c5a4857236ca0965b3bd6da79c7f58fb80cc2d5df0f9"} Mar 19 00:26:11 crc kubenswrapper[4745]: I0319 00:26:11.165923 4745 generic.go:334] "Generic (PLEG): container finished" podID="534b93f2-ab59-4958-9374-29c114fab497" containerID="6af6d11839f61cc5eb6c4ada548e09fae08633a6e79b3849dd1445a55e9baf3d" exitCode=0 Mar 19 00:26:11 crc kubenswrapper[4745]: I0319 00:26:11.166002 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerDied","Data":"6af6d11839f61cc5eb6c4ada548e09fae08633a6e79b3849dd1445a55e9baf3d"} Mar 19 00:26:12 crc kubenswrapper[4745]: I0319 00:26:12.177865 4745 generic.go:334] "Generic (PLEG): container finished" podID="534b93f2-ab59-4958-9374-29c114fab497" containerID="409ec20be66c2df87bfc00e1e2571b2f14c5260bc018c8103a5d083b4bbb413a" exitCode=0 Mar 19 00:26:12 crc kubenswrapper[4745]: I0319 00:26:12.178259 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerDied","Data":"409ec20be66c2df87bfc00e1e2571b2f14c5260bc018c8103a5d083b4bbb413a"} Mar 19 00:26:12 crc kubenswrapper[4745]: I0319 00:26:12.219645 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-core-2-build_534b93f2-ab59-4958-9374-29c114fab497/manage-dockerfile/0.log" Mar 19 00:26:13 crc kubenswrapper[4745]: I0319 00:26:13.188655 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerStarted","Data":"364a672dbba13054a5bf498f730b5775c59edbc1f42527c6d8984982a4ab68c2"} Mar 19 00:26:13 crc kubenswrapper[4745]: I0319 00:26:13.218439 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-core-2-build" podStartSLOduration=5.218420379 podStartE2EDuration="5.218420379s" podCreationTimestamp="2026-03-19 00:26:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:26:13.214434236 +0000 UTC m=+1137.752629387" watchObservedRunningTime="2026-03-19 00:26:13.218420379 +0000 UTC m=+1137.756615510" Mar 19 00:26:35 crc kubenswrapper[4745]: I0319 00:26:35.218925 4745 scope.go:117] "RemoveContainer" containerID="1663b5c8bcd4ae3a664653728fe6c21020e126b8db8f2cf94f1cfba9c6c7bbc2" Mar 19 00:26:45 crc kubenswrapper[4745]: I0319 00:26:45.606005 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:26:45 crc kubenswrapper[4745]: I0319 00:26:45.607062 4745 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:27:15 crc kubenswrapper[4745]: I0319 00:27:15.606476 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:27:15 crc kubenswrapper[4745]: I0319 00:27:15.607147 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.606013 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.606984 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.607070 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:27:45 crc 
kubenswrapper[4745]: I0319 00:27:45.607964 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.608063 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89" gracePeriod=600 Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.819280 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89" exitCode=0 Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.819382 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89"} Mar 19 00:27:45 crc kubenswrapper[4745]: I0319 00:27:45.819764 4745 scope.go:117] "RemoveContainer" containerID="b7b021cd8b07360e8af6249aac1835e212578d41089c112a8709760bee2deb06" Mar 19 00:27:46 crc kubenswrapper[4745]: I0319 00:27:46.830524 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5"} Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.146819 4745 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564668-66fs6"] Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.148763 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.153032 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.153309 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.154112 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.154413 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564668-66fs6"] Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.232510 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftw9t\" (UniqueName: \"kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t\") pod \"auto-csr-approver-29564668-66fs6\" (UID: \"c9ad0116-35eb-40db-8d57-4501affdf59c\") " pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.334390 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftw9t\" (UniqueName: \"kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t\") pod \"auto-csr-approver-29564668-66fs6\" (UID: \"c9ad0116-35eb-40db-8d57-4501affdf59c\") " pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.355819 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftw9t\" 
(UniqueName: \"kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t\") pod \"auto-csr-approver-29564668-66fs6\" (UID: \"c9ad0116-35eb-40db-8d57-4501affdf59c\") " pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:00 crc kubenswrapper[4745]: I0319 00:28:00.472729 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:01 crc kubenswrapper[4745]: I0319 00:28:01.035866 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564668-66fs6"] Mar 19 00:28:01 crc kubenswrapper[4745]: W0319 00:28:01.038435 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9ad0116_35eb_40db_8d57_4501affdf59c.slice/crio-c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702 WatchSource:0}: Error finding container c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702: Status 404 returned error can't find the container with id c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702 Mar 19 00:28:01 crc kubenswrapper[4745]: I0319 00:28:01.041738 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 00:28:01 crc kubenswrapper[4745]: I0319 00:28:01.938393 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564668-66fs6" event={"ID":"c9ad0116-35eb-40db-8d57-4501affdf59c","Type":"ContainerStarted","Data":"c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702"} Mar 19 00:28:02 crc kubenswrapper[4745]: I0319 00:28:02.947986 4745 generic.go:334] "Generic (PLEG): container finished" podID="c9ad0116-35eb-40db-8d57-4501affdf59c" containerID="08e036dc6c9a44bd6fdc8a12f3525fb5e0bf5c4fdd30613e6e3e3b5a2939ce17" exitCode=0 Mar 19 00:28:02 crc kubenswrapper[4745]: I0319 00:28:02.948122 4745 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564668-66fs6" event={"ID":"c9ad0116-35eb-40db-8d57-4501affdf59c","Type":"ContainerDied","Data":"08e036dc6c9a44bd6fdc8a12f3525fb5e0bf5c4fdd30613e6e3e3b5a2939ce17"} Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.218728 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.307672 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftw9t\" (UniqueName: \"kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t\") pod \"c9ad0116-35eb-40db-8d57-4501affdf59c\" (UID: \"c9ad0116-35eb-40db-8d57-4501affdf59c\") " Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.316005 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t" (OuterVolumeSpecName: "kube-api-access-ftw9t") pod "c9ad0116-35eb-40db-8d57-4501affdf59c" (UID: "c9ad0116-35eb-40db-8d57-4501affdf59c"). InnerVolumeSpecName "kube-api-access-ftw9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.409618 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftw9t\" (UniqueName: \"kubernetes.io/projected/c9ad0116-35eb-40db-8d57-4501affdf59c-kube-api-access-ftw9t\") on node \"crc\" DevicePath \"\"" Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.961803 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564668-66fs6" event={"ID":"c9ad0116-35eb-40db-8d57-4501affdf59c","Type":"ContainerDied","Data":"c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702"} Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.961854 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2272424c5ada381be349899a76801cce4e35bded259fe03ed1169b562e94702" Mar 19 00:28:04 crc kubenswrapper[4745]: I0319 00:28:04.962512 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564668-66fs6" Mar 19 00:28:05 crc kubenswrapper[4745]: I0319 00:28:05.384266 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564662-znhfd"] Mar 19 00:28:05 crc kubenswrapper[4745]: I0319 00:28:05.390028 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564662-znhfd"] Mar 19 00:28:06 crc kubenswrapper[4745]: I0319 00:28:06.145363 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="deed3a0c-ada3-41b5-895b-8acc45926539" path="/var/lib/kubelet/pods/deed3a0c-ada3-41b5-895b-8acc45926539/volumes" Mar 19 00:28:35 crc kubenswrapper[4745]: I0319 00:28:35.302355 4745 scope.go:117] "RemoveContainer" containerID="7d29ab0663977a94ba5c0f15b3cbd0ce7ec172f2fc28bc0ca2d89b44013b1e84" Mar 19 00:29:40 crc kubenswrapper[4745]: I0319 00:29:40.839079 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="534b93f2-ab59-4958-9374-29c114fab497" containerID="364a672dbba13054a5bf498f730b5775c59edbc1f42527c6d8984982a4ab68c2" exitCode=0 Mar 19 00:29:40 crc kubenswrapper[4745]: I0319 00:29:40.839164 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerDied","Data":"364a672dbba13054a5bf498f730b5775c59edbc1f42527c6d8984982a4ab68c2"} Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.080454 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172286 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172339 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172376 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172412 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir\") pod 
\"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172436 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172477 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172504 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172531 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172600 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172621 
4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l7lvf\" (UniqueName: \"kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172653 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172692 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets\") pod \"534b93f2-ab59-4958-9374-29c114fab497\" (UID: \"534b93f2-ab59-4958-9374-29c114fab497\") " Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.172739 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.173005 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.173040 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.173521 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.174514 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.175154 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.175166 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.180075 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf" (OuterVolumeSpecName: "kube-api-access-l7lvf") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "kube-api-access-l7lvf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.180078 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.180721 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.185357 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.274637 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275090 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275176 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275242 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275303 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/534b93f2-ab59-4958-9374-29c114fab497-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275359 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275433 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l7lvf\" (UniqueName: \"kubernetes.io/projected/534b93f2-ab59-4958-9374-29c114fab497-kube-api-access-l7lvf\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275489 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/534b93f2-ab59-4958-9374-29c114fab497-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.275541 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/534b93f2-ab59-4958-9374-29c114fab497-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.589025 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.682614 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.857692 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-core-2-build" event={"ID":"534b93f2-ab59-4958-9374-29c114fab497","Type":"ContainerDied","Data":"7aed128a4133bb08f4b1c5a4857236ca0965b3bd6da79c7f58fb80cc2d5df0f9"} Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.857753 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7aed128a4133bb08f4b1c5a4857236ca0965b3bd6da79c7f58fb80cc2d5df0f9" Mar 19 00:29:42 crc kubenswrapper[4745]: I0319 00:29:42.857848 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-core-2-build" Mar 19 00:29:44 crc kubenswrapper[4745]: I0319 00:29:44.825305 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "534b93f2-ab59-4958-9374-29c114fab497" (UID: "534b93f2-ab59-4958-9374-29c114fab497"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:44 crc kubenswrapper[4745]: I0319 00:29:44.919677 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/534b93f2-ab59-4958-9374-29c114fab497-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:45 crc kubenswrapper[4745]: I0319 00:29:45.606219 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:29:45 crc kubenswrapper[4745]: I0319 00:29:45.606693 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.376684 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 19 00:29:47 crc kubenswrapper[4745]: E0319 00:29:47.377024 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="git-clone" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377041 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="git-clone" Mar 19 00:29:47 crc kubenswrapper[4745]: E0319 00:29:47.377060 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="manage-dockerfile" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377067 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="534b93f2-ab59-4958-9374-29c114fab497" 
containerName="manage-dockerfile" Mar 19 00:29:47 crc kubenswrapper[4745]: E0319 00:29:47.377088 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9ad0116-35eb-40db-8d57-4501affdf59c" containerName="oc" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377098 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9ad0116-35eb-40db-8d57-4501affdf59c" containerName="oc" Mar 19 00:29:47 crc kubenswrapper[4745]: E0319 00:29:47.377106 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="docker-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377112 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="docker-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377213 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9ad0116-35eb-40db-8d57-4501affdf59c" containerName="oc" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377224 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="534b93f2-ab59-4958-9374-29c114fab497" containerName="docker-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.377963 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.380392 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.380420 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-ca" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.381264 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-global-ca" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.382181 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-1-sys-config" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.401824 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435507 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435576 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435597 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435614 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435635 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435650 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435667 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435685 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435702 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435900 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.435978 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6fbq\" (UniqueName: \"kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.436031 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.536800 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.536853 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.536871 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.536944 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.536975 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537013 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537034 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6fbq\" (UniqueName: \"kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537099 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537128 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537148 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537171 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537234 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537570 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.537970 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.538158 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run\") pod \"sg-bridge-1-build\" (UID: 
\"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.540062 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.540588 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.540973 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.541377 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.542120 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc 
kubenswrapper[4745]: I0319 00:29:47.547658 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.548067 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.557091 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6fbq\" (UniqueName: \"kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq\") pod \"sg-bridge-1-build\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.694381 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.888161 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 19 00:29:47 crc kubenswrapper[4745]: I0319 00:29:47.931433 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"107067a8-8942-4ede-9614-121991e06616","Type":"ContainerStarted","Data":"4c71a05e40b649b82c93fa62486b6094516be105dfaccb79ed3cf959fdf04635"} Mar 19 00:29:48 crc kubenswrapper[4745]: I0319 00:29:48.942615 4745 generic.go:334] "Generic (PLEG): container finished" podID="107067a8-8942-4ede-9614-121991e06616" containerID="649af08f705b85b72e2b308ff29da2e7d5edce2a304ca0c563098fa2b731a46b" exitCode=0 Mar 19 00:29:48 crc kubenswrapper[4745]: I0319 00:29:48.942692 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"107067a8-8942-4ede-9614-121991e06616","Type":"ContainerDied","Data":"649af08f705b85b72e2b308ff29da2e7d5edce2a304ca0c563098fa2b731a46b"} Mar 19 00:29:49 crc kubenswrapper[4745]: I0319 00:29:49.957333 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"107067a8-8942-4ede-9614-121991e06616","Type":"ContainerStarted","Data":"ce8a63b6903edaf1dfec62801e62f011b41ac609121cec118da8bcbd296b697b"} Mar 19 00:29:49 crc kubenswrapper[4745]: I0319 00:29:49.992023 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-1-build" podStartSLOduration=2.991998218 podStartE2EDuration="2.991998218s" podCreationTimestamp="2026-03-19 00:29:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:29:49.991838523 +0000 UTC m=+1354.530033664" watchObservedRunningTime="2026-03-19 00:29:49.991998218 +0000 UTC m=+1354.530193349" Mar 19 00:29:56 
crc kubenswrapper[4745]: I0319 00:29:56.002942 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_107067a8-8942-4ede-9614-121991e06616/docker-build/0.log" Mar 19 00:29:56 crc kubenswrapper[4745]: I0319 00:29:56.003993 4745 generic.go:334] "Generic (PLEG): container finished" podID="107067a8-8942-4ede-9614-121991e06616" containerID="ce8a63b6903edaf1dfec62801e62f011b41ac609121cec118da8bcbd296b697b" exitCode=1 Mar 19 00:29:56 crc kubenswrapper[4745]: I0319 00:29:56.004048 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-1-build" event={"ID":"107067a8-8942-4ede-9614-121991e06616","Type":"ContainerDied","Data":"ce8a63b6903edaf1dfec62801e62f011b41ac609121cec118da8bcbd296b697b"} Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.263471 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_107067a8-8942-4ede-9614-121991e06616/docker-build/0.log" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.264554 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392621 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392721 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392757 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392835 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392935 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6fbq\" (UniqueName: \"kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392965 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.392985 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393026 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393036 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393071 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393201 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393287 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393325 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets\") pod \"107067a8-8942-4ede-9614-121991e06616\" (UID: \"107067a8-8942-4ede-9614-121991e06616\") " Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393796 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393831 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.393959 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.394538 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.394573 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.395039 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.395275 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.401065 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.401092 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.401108 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq" (OuterVolumeSpecName: "kube-api-access-p6fbq") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "kube-api-access-p6fbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.485471 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.495904 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496332 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/107067a8-8942-4ede-9614-121991e06616-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496343 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496352 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496361 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496369 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6fbq\" (UniqueName: \"kubernetes.io/projected/107067a8-8942-4ede-9614-121991e06616-kube-api-access-p6fbq\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496377 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496385 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/107067a8-8942-4ede-9614-121991e06616-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496393 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.496404 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/107067a8-8942-4ede-9614-121991e06616-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.760864 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "107067a8-8942-4ede-9614-121991e06616" (UID: "107067a8-8942-4ede-9614-121991e06616"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.801230 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/107067a8-8942-4ede-9614-121991e06616-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.833561 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 19 00:29:57 crc kubenswrapper[4745]: I0319 00:29:57.839203 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/sg-bridge-1-build"] Mar 19 00:29:58 crc kubenswrapper[4745]: I0319 00:29:58.019493 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-1-build_107067a8-8942-4ede-9614-121991e06616/docker-build/0.log" Mar 19 00:29:58 crc kubenswrapper[4745]: I0319 00:29:58.019964 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c71a05e40b649b82c93fa62486b6094516be105dfaccb79ed3cf959fdf04635" Mar 19 00:29:58 crc kubenswrapper[4745]: I0319 00:29:58.020079 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-1-build" Mar 19 00:29:58 crc kubenswrapper[4745]: I0319 00:29:58.147024 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107067a8-8942-4ede-9614-121991e06616" path="/var/lib/kubelet/pods/107067a8-8942-4ede-9614-121991e06616/volumes" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.529819 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 19 00:29:59 crc kubenswrapper[4745]: E0319 00:29:59.530144 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107067a8-8942-4ede-9614-121991e06616" containerName="manage-dockerfile" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.530157 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="107067a8-8942-4ede-9614-121991e06616" containerName="manage-dockerfile" Mar 19 00:29:59 crc kubenswrapper[4745]: E0319 00:29:59.530179 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107067a8-8942-4ede-9614-121991e06616" containerName="docker-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.530186 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="107067a8-8942-4ede-9614-121991e06616" containerName="docker-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.530307 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="107067a8-8942-4ede-9614-121991e06616" containerName="docker-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.531253 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.533753 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-global-ca" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.534144 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.534227 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-sys-config" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.534188 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"sg-bridge-2-ca" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.550072 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.629674 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630283 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630306 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bj7z\" (UniqueName: 
\"kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630363 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630390 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630508 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630569 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630658 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" 
(UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630700 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630770 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630799 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.630874 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733008 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733095 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733125 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bj7z\" (UniqueName: \"kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733143 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733165 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733189 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: 
\"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733209 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733228 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733261 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733286 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733311 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull\") pod 
\"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733337 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733739 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733771 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.733732 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.734009 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 
00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.734117 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.734233 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.734257 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.734250 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.735522 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.748179 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.751507 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bj7z\" (UniqueName: \"kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.754350 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull\") pod \"sg-bridge-2-build\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " pod="service-telemetry/sg-bridge-2-build" Mar 19 00:29:59 crc kubenswrapper[4745]: I0319 00:29:59.847398 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.152013 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.152959 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.155917 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.155982 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.158627 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.239093 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564670-8sw74"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.240235 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.243870 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.244010 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmp5d\" (UniqueName: \"kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: 
I0319 00:30:00.244069 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.244103 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzdff\" (UniqueName: \"kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff\") pod \"auto-csr-approver-29564670-8sw74\" (UID: \"9a566f97-13b9-4fde-868a-f55bd82a1af6\") " pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.244468 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.244711 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.252131 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.257761 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564670-8sw74"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.292793 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/sg-bridge-2-build"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.345163 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmp5d\" (UniqueName: \"kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d\") pod 
\"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.345217 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.345274 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.345310 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdff\" (UniqueName: \"kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff\") pod \"auto-csr-approver-29564670-8sw74\" (UID: \"9a566f97-13b9-4fde-868a-f55bd82a1af6\") " pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.346122 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.355391 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.364145 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdff\" (UniqueName: \"kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff\") pod \"auto-csr-approver-29564670-8sw74\" (UID: \"9a566f97-13b9-4fde-868a-f55bd82a1af6\") " pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.364271 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmp5d\" (UniqueName: \"kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d\") pod \"collect-profiles-29564670-ps2vr\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.479095 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.571195 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.691749 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr"] Mar 19 00:30:00 crc kubenswrapper[4745]: I0319 00:30:00.800484 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564670-8sw74"] Mar 19 00:30:00 crc kubenswrapper[4745]: W0319 00:30:00.807977 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a566f97_13b9_4fde_868a_f55bd82a1af6.slice/crio-94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995 WatchSource:0}: Error finding container 94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995: Status 404 returned error can't find the container with id 94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995 Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.066296 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564670-8sw74" event={"ID":"9a566f97-13b9-4fde-868a-f55bd82a1af6","Type":"ContainerStarted","Data":"94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995"} Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.068896 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" event={"ID":"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848","Type":"ContainerStarted","Data":"f162a6529bd91b5afff5c3c3d61fb698e5e267339659205f4beb33078fc998f2"} Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.068945 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" 
event={"ID":"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848","Type":"ContainerStarted","Data":"a4a63506cc075bbeba17c49e6c84f7af721e430f03dba25dbfdcb03fbd700246"} Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.071088 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerStarted","Data":"76b872a9c674840046cfe8380bd40ec0bac4b1772b00ad0e5157a9592cf2c428"} Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.071152 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerStarted","Data":"8063e75c0220463aa8c8f5be769f651202ef2ad90cb60349d106fdd478eddfca"} Mar 19 00:30:01 crc kubenswrapper[4745]: I0319 00:30:01.086594 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" podStartSLOduration=1.086571323 podStartE2EDuration="1.086571323s" podCreationTimestamp="2026-03-19 00:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:30:01.082472584 +0000 UTC m=+1365.620667715" watchObservedRunningTime="2026-03-19 00:30:01.086571323 +0000 UTC m=+1365.624766454" Mar 19 00:30:02 crc kubenswrapper[4745]: I0319 00:30:02.080997 4745 generic.go:334] "Generic (PLEG): container finished" podID="750c31ab-bd58-4423-bb43-45dccd385cab" containerID="76b872a9c674840046cfe8380bd40ec0bac4b1772b00ad0e5157a9592cf2c428" exitCode=0 Mar 19 00:30:02 crc kubenswrapper[4745]: I0319 00:30:02.081109 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerDied","Data":"76b872a9c674840046cfe8380bd40ec0bac4b1772b00ad0e5157a9592cf2c428"} Mar 19 00:30:02 crc kubenswrapper[4745]: I0319 
00:30:02.085346 4745 generic.go:334] "Generic (PLEG): container finished" podID="d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" containerID="f162a6529bd91b5afff5c3c3d61fb698e5e267339659205f4beb33078fc998f2" exitCode=0 Mar 19 00:30:02 crc kubenswrapper[4745]: I0319 00:30:02.085389 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" event={"ID":"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848","Type":"ContainerDied","Data":"f162a6529bd91b5afff5c3c3d61fb698e5e267339659205f4beb33078fc998f2"} Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.094370 4745 generic.go:334] "Generic (PLEG): container finished" podID="750c31ab-bd58-4423-bb43-45dccd385cab" containerID="c3e0a9b53a8a5c2e01dc0c2e7771eef3526e9f2a8d106e37fc3e542da15fa712" exitCode=0 Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.094449 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerDied","Data":"c3e0a9b53a8a5c2e01dc0c2e7771eef3526e9f2a8d106e37fc3e542da15fa712"} Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.097101 4745 generic.go:334] "Generic (PLEG): container finished" podID="9a566f97-13b9-4fde-868a-f55bd82a1af6" containerID="e1e89ee6fc2c85074b56c8f19c7bf183b3c352108812ec9dcafde77f229e8ca5" exitCode=0 Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.097186 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564670-8sw74" event={"ID":"9a566f97-13b9-4fde-868a-f55bd82a1af6","Type":"ContainerDied","Data":"e1e89ee6fc2c85074b56c8f19c7bf183b3c352108812ec9dcafde77f229e8ca5"} Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.157244 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_sg-bridge-2-build_750c31ab-bd58-4423-bb43-45dccd385cab/manage-dockerfile/0.log" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.413349 4745 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.498051 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume\") pod \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.498198 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume\") pod \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.498253 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmp5d\" (UniqueName: \"kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d\") pod \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\" (UID: \"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848\") " Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.499256 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume" (OuterVolumeSpecName: "config-volume") pod "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" (UID: "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.505701 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d" (OuterVolumeSpecName: "kube-api-access-pmp5d") pod "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" (UID: "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848"). InnerVolumeSpecName "kube-api-access-pmp5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.506646 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" (UID: "d9d09ccf-39bb-4eb2-8b3b-96338b5b9848"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.599887 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.600257 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmp5d\" (UniqueName: \"kubernetes.io/projected/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-kube-api-access-pmp5d\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:03 crc kubenswrapper[4745]: I0319 00:30:03.600341 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/d9d09ccf-39bb-4eb2-8b3b-96338b5b9848-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.106630 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.107000 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564670-ps2vr" event={"ID":"d9d09ccf-39bb-4eb2-8b3b-96338b5b9848","Type":"ContainerDied","Data":"a4a63506cc075bbeba17c49e6c84f7af721e430f03dba25dbfdcb03fbd700246"} Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.107502 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a4a63506cc075bbeba17c49e6c84f7af721e430f03dba25dbfdcb03fbd700246" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.109067 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerStarted","Data":"b6a387acf56a10a05c81ffb80083f81d17823aeec4236abe8ff34638d4305403"} Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.145527 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/sg-bridge-2-build" podStartSLOduration=5.145505605 podStartE2EDuration="5.145505605s" podCreationTimestamp="2026-03-19 00:29:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:30:04.143651498 +0000 UTC m=+1368.681846629" watchObservedRunningTime="2026-03-19 00:30:04.145505605 +0000 UTC m=+1368.683700736" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.364870 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.509549 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzdff\" (UniqueName: \"kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff\") pod \"9a566f97-13b9-4fde-868a-f55bd82a1af6\" (UID: \"9a566f97-13b9-4fde-868a-f55bd82a1af6\") " Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.515308 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff" (OuterVolumeSpecName: "kube-api-access-tzdff") pod "9a566f97-13b9-4fde-868a-f55bd82a1af6" (UID: "9a566f97-13b9-4fde-868a-f55bd82a1af6"). InnerVolumeSpecName "kube-api-access-tzdff". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:30:04 crc kubenswrapper[4745]: I0319 00:30:04.611269 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzdff\" (UniqueName: \"kubernetes.io/projected/9a566f97-13b9-4fde-868a-f55bd82a1af6-kube-api-access-tzdff\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:05 crc kubenswrapper[4745]: I0319 00:30:05.117187 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564670-8sw74" event={"ID":"9a566f97-13b9-4fde-868a-f55bd82a1af6","Type":"ContainerDied","Data":"94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995"} Mar 19 00:30:05 crc kubenswrapper[4745]: I0319 00:30:05.117240 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94124bc26fea9877d2754a86a07c8c0716612e1d40c4900162d26957454a2995" Mar 19 00:30:05 crc kubenswrapper[4745]: I0319 00:30:05.117239 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564670-8sw74" Mar 19 00:30:05 crc kubenswrapper[4745]: I0319 00:30:05.426211 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564664-hhtnq"] Mar 19 00:30:05 crc kubenswrapper[4745]: I0319 00:30:05.433498 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564664-hhtnq"] Mar 19 00:30:06 crc kubenswrapper[4745]: I0319 00:30:06.145794 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9a8819f-c57d-463c-9089-fbf3b29e12bc" path="/var/lib/kubelet/pods/d9a8819f-c57d-463c-9089-fbf3b29e12bc/volumes" Mar 19 00:30:15 crc kubenswrapper[4745]: I0319 00:30:15.606973 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:30:15 crc kubenswrapper[4745]: I0319 00:30:15.607945 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:30:35 crc kubenswrapper[4745]: I0319 00:30:35.376707 4745 scope.go:117] "RemoveContainer" containerID="102c360c5a32588e5b65407ade841670a288ba3942421d8abc329207a20bc972" Mar 19 00:30:45 crc kubenswrapper[4745]: I0319 00:30:45.605977 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:30:45 crc kubenswrapper[4745]: 
I0319 00:30:45.606593 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:30:45 crc kubenswrapper[4745]: I0319 00:30:45.606656 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:30:45 crc kubenswrapper[4745]: I0319 00:30:45.608003 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:30:45 crc kubenswrapper[4745]: I0319 00:30:45.608077 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5" gracePeriod=600 Mar 19 00:30:46 crc kubenswrapper[4745]: I0319 00:30:46.408039 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5" exitCode=0 Mar 19 00:30:46 crc kubenswrapper[4745]: I0319 00:30:46.408148 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5"} Mar 19 00:30:46 crc 
kubenswrapper[4745]: I0319 00:30:46.408539 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"} Mar 19 00:30:46 crc kubenswrapper[4745]: I0319 00:30:46.408571 4745 scope.go:117] "RemoveContainer" containerID="b78af311b5799c4bf0cb1c94c0266d9488b1808ed2c9ef987dfc6da2fa0eaa89" Mar 19 00:30:47 crc kubenswrapper[4745]: I0319 00:30:47.418644 4745 generic.go:334] "Generic (PLEG): container finished" podID="750c31ab-bd58-4423-bb43-45dccd385cab" containerID="b6a387acf56a10a05c81ffb80083f81d17823aeec4236abe8ff34638d4305403" exitCode=0 Mar 19 00:30:47 crc kubenswrapper[4745]: I0319 00:30:47.418716 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerDied","Data":"b6a387acf56a10a05c81ffb80083f81d17823aeec4236abe8ff34638d4305403"} Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.714570 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785004 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785055 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785104 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785131 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785164 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785185 4745 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785230 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785261 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785306 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bj7z\" (UniqueName: \"kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785430 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785488 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run\") 
pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785515 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles\") pod \"750c31ab-bd58-4423-bb43-45dccd385cab\" (UID: \"750c31ab-bd58-4423-bb43-45dccd385cab\") " Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785737 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.785975 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.786393 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.786446 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). 
InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.786476 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.786563 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.788657 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.789270 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.794626 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.797009 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.802532 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z" (OuterVolumeSpecName: "kube-api-access-2bj7z") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "kube-api-access-2bj7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888088 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888167 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888189 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888209 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/750c31ab-bd58-4423-bb43-45dccd385cab-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888225 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/750c31ab-bd58-4423-bb43-45dccd385cab-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888240 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bj7z\" (UniqueName: \"kubernetes.io/projected/750c31ab-bd58-4423-bb43-45dccd385cab-kube-api-access-2bj7z\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888256 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-buildworkdir\") on node 
\"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888302 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.888318 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/750c31ab-bd58-4423-bb43-45dccd385cab-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.918133 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:30:48 crc kubenswrapper[4745]: I0319 00:30:48.989979 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:49 crc kubenswrapper[4745]: I0319 00:30:49.443705 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/sg-bridge-2-build" event={"ID":"750c31ab-bd58-4423-bb43-45dccd385cab","Type":"ContainerDied","Data":"8063e75c0220463aa8c8f5be769f651202ef2ad90cb60349d106fdd478eddfca"} Mar 19 00:30:49 crc kubenswrapper[4745]: I0319 00:30:49.443760 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8063e75c0220463aa8c8f5be769f651202ef2ad90cb60349d106fdd478eddfca" Mar 19 00:30:49 crc kubenswrapper[4745]: I0319 00:30:49.443902 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/sg-bridge-2-build" Mar 19 00:30:49 crc kubenswrapper[4745]: I0319 00:30:49.559422 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "750c31ab-bd58-4423-bb43-45dccd385cab" (UID: "750c31ab-bd58-4423-bb43-45dccd385cab"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:30:49 crc kubenswrapper[4745]: I0319 00:30:49.601207 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/750c31ab-bd58-4423-bb43-45dccd385cab-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.881799 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 19 00:30:53 crc kubenswrapper[4745]: E0319 00:30:53.882776 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="manage-dockerfile" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882790 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="manage-dockerfile" Mar 19 00:30:53 crc kubenswrapper[4745]: E0319 00:30:53.882803 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a566f97-13b9-4fde-868a-f55bd82a1af6" containerName="oc" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882809 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a566f97-13b9-4fde-868a-f55bd82a1af6" containerName="oc" Mar 19 00:30:53 crc kubenswrapper[4745]: E0319 00:30:53.882818 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="git-clone" Mar 19 00:30:53 crc 
kubenswrapper[4745]: I0319 00:30:53.882825 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="git-clone" Mar 19 00:30:53 crc kubenswrapper[4745]: E0319 00:30:53.882836 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" containerName="collect-profiles" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882841 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" containerName="collect-profiles" Mar 19 00:30:53 crc kubenswrapper[4745]: E0319 00:30:53.882853 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="docker-build" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882858 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="docker-build" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882986 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="750c31ab-bd58-4423-bb43-45dccd385cab" containerName="docker-build" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.882999 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d09ccf-39bb-4eb2-8b3b-96338b5b9848" containerName="collect-profiles" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.883011 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a566f97-13b9-4fde-868a-f55bd82a1af6" containerName="oc" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.883659 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.885859 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-sys-config" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.886386 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-global-ca" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.886835 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.887243 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-1-ca" Mar 19 00:30:53 crc kubenswrapper[4745]: I0319 00:30:53.899132 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062158 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062684 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062714 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062793 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062830 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062862 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.062901 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdcqg\" (UniqueName: \"kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") 
" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.063063 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.063117 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.063160 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.063187 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.063346 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164294 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164363 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164382 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdcqg\" (UniqueName: \"kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164406 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164432 4745 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164454 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164475 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164496 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164527 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc 
kubenswrapper[4745]: I0319 00:30:54.164557 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164589 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164643 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164645 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.164584 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " 
pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.165110 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.165338 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.166230 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.166556 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.166794 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: 
\"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.166941 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.167106 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.171375 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.179599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.187702 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdcqg\" (UniqueName: 
\"kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg\") pod \"prometheus-webhook-snmp-1-build\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.203669 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:30:54 crc kubenswrapper[4745]: I0319 00:30:54.606692 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 19 00:30:55 crc kubenswrapper[4745]: I0319 00:30:55.485534 4745 generic.go:334] "Generic (PLEG): container finished" podID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerID="8e243def335e49da6a1d57cd97c25f6d3979c96c8c683baa98a92a5cca110908" exitCode=0 Mar 19 00:30:55 crc kubenswrapper[4745]: I0319 00:30:55.485660 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"cba4406e-8dc7-4dc5-a7c0-7778f01d1028","Type":"ContainerDied","Data":"8e243def335e49da6a1d57cd97c25f6d3979c96c8c683baa98a92a5cca110908"} Mar 19 00:30:55 crc kubenswrapper[4745]: I0319 00:30:55.486019 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"cba4406e-8dc7-4dc5-a7c0-7778f01d1028","Type":"ContainerStarted","Data":"d99deadfe1fdaec34e88fd1333d4bf6a3807b8f55f4f7cd2551dba743f9ec241"} Mar 19 00:30:56 crc kubenswrapper[4745]: I0319 00:30:56.502375 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"cba4406e-8dc7-4dc5-a7c0-7778f01d1028","Type":"ContainerStarted","Data":"81f9b5877c26083562eeca7126b020fa88cfcf8ed70c30b3b2a8bcc49ea467a7"} Mar 19 00:30:56 crc kubenswrapper[4745]: I0319 00:30:56.530821 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/prometheus-webhook-snmp-1-build" podStartSLOduration=3.530797326 podStartE2EDuration="3.530797326s" podCreationTimestamp="2026-03-19 00:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:30:56.530281669 +0000 UTC m=+1421.068476810" watchObservedRunningTime="2026-03-19 00:30:56.530797326 +0000 UTC m=+1421.068992457" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.287734 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"] Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.289342 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/prometheus-webhook-snmp-1-build" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="docker-build" containerID="cri-o://81f9b5877c26083562eeca7126b020fa88cfcf8ed70c30b3b2a8bcc49ea467a7" gracePeriod=30 Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.552211 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_cba4406e-8dc7-4dc5-a7c0-7778f01d1028/docker-build/0.log" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.553112 4745 generic.go:334] "Generic (PLEG): container finished" podID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerID="81f9b5877c26083562eeca7126b020fa88cfcf8ed70c30b3b2a8bcc49ea467a7" exitCode=1 Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.553172 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"cba4406e-8dc7-4dc5-a7c0-7778f01d1028","Type":"ContainerDied","Data":"81f9b5877c26083562eeca7126b020fa88cfcf8ed70c30b3b2a8bcc49ea467a7"} Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.689768 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_cba4406e-8dc7-4dc5-a7c0-7778f01d1028/docker-build/0.log" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.690197 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718420 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718562 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718621 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718660 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718703 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718725 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718757 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718805 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718833 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718864 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: 
\"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718929 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.718983 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sdcqg\" (UniqueName: \"kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg\") pod \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\" (UID: \"cba4406e-8dc7-4dc5-a7c0-7778f01d1028\") " Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.719282 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.719398 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.719928 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.720370 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.720433 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.720610 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.719819 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.736859 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.737085 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.737125 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg" (OuterVolumeSpecName: "kube-api-access-sdcqg") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "kube-api-access-sdcqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.796684 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820573 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820658 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820672 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820681 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820690 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820699 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820743 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820756 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820765 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sdcqg\" (UniqueName: \"kubernetes.io/projected/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-kube-api-access-sdcqg\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820795 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:04 crc kubenswrapper[4745]: I0319 00:31:04.820804 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.120779 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "cba4406e-8dc7-4dc5-a7c0-7778f01d1028" (UID: "cba4406e-8dc7-4dc5-a7c0-7778f01d1028"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.124411 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/cba4406e-8dc7-4dc5-a7c0-7778f01d1028-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.562388 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-1-build_cba4406e-8dc7-4dc5-a7c0-7778f01d1028/docker-build/0.log"
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.563009 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-1-build" event={"ID":"cba4406e-8dc7-4dc5-a7c0-7778f01d1028","Type":"ContainerDied","Data":"d99deadfe1fdaec34e88fd1333d4bf6a3807b8f55f4f7cd2551dba743f9ec241"}
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.563070 4745 scope.go:117] "RemoveContainer" containerID="81f9b5877c26083562eeca7126b020fa88cfcf8ed70c30b3b2a8bcc49ea467a7"
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.563304 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-1-build"
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.592065 4745 scope.go:117] "RemoveContainer" containerID="8e243def335e49da6a1d57cd97c25f6d3979c96c8c683baa98a92a5cca110908"
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.619249 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.623639 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-1-build"]
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.992973 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Mar 19 00:31:05 crc kubenswrapper[4745]: E0319 00:31:05.993525 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="docker-build"
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.993545 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="docker-build"
Mar 19 00:31:05 crc kubenswrapper[4745]: E0319 00:31:05.993557 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="manage-dockerfile"
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.993564 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="manage-dockerfile"
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.994657 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" containerName="docker-build"
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.997066 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:05 crc kubenswrapper[4745]: I0319 00:31:05.999193 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-sys-config"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.000967 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.001269 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-global-ca"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.002476 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-webhook-snmp-2-ca"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.015717 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040335 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040419 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040479 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040516 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040553 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040586 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nvmj\" (UniqueName: \"kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040654 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040681 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040711 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040741 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.040987 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.041113 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143094 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143144 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143185 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143217 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143277 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143299 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143319 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143338 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nvmj\" (UniqueName: \"kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143357 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143377 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143399 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143427 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143456 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143574 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.143947 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.144338 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.144352 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.144480 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.144725 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.144853 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.145230 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.146152 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba4406e-8dc7-4dc5-a7c0-7778f01d1028" path="/var/lib/kubelet/pods/cba4406e-8dc7-4dc5-a7c0-7778f01d1028/volumes"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.148950 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.151233 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.167094 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nvmj\" (UniqueName: \"kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj\") pod \"prometheus-webhook-snmp-2-build\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.318290 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.526686 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-webhook-snmp-2-build"]
Mar 19 00:31:06 crc kubenswrapper[4745]: W0319 00:31:06.533789 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3cc8a26_ff02_4e22_b924_7ba0a0bf0cdb.slice/crio-8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6 WatchSource:0}: Error finding container 8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6: Status 404 returned error can't find the container with id 8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6
Mar 19 00:31:06 crc kubenswrapper[4745]: I0319 00:31:06.573355 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerStarted","Data":"8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6"}
Mar 19 00:31:07 crc kubenswrapper[4745]: I0319 00:31:07.583262 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerStarted","Data":"12de399145ebf4351ca99846af1271e3bc7ba9b124e5e71af2424115e30a6462"}
Mar 19 00:31:08 crc kubenswrapper[4745]: I0319 00:31:08.617969 4745 generic.go:334] "Generic (PLEG): container finished" podID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerID="12de399145ebf4351ca99846af1271e3bc7ba9b124e5e71af2424115e30a6462" exitCode=0
Mar 19 00:31:08 crc kubenswrapper[4745]: I0319 00:31:08.618037 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerDied","Data":"12de399145ebf4351ca99846af1271e3bc7ba9b124e5e71af2424115e30a6462"}
Mar 19 00:31:09 crc kubenswrapper[4745]: I0319 00:31:09.632309 4745 generic.go:334] "Generic (PLEG): container finished" podID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerID="6caa75a3508ec6bf1e70827ac043333a39f59633de038b233135fa516bdf907e" exitCode=0
Mar 19 00:31:09 crc kubenswrapper[4745]: I0319 00:31:09.632439 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerDied","Data":"6caa75a3508ec6bf1e70827ac043333a39f59633de038b233135fa516bdf907e"}
Mar 19 00:31:09 crc kubenswrapper[4745]: I0319 00:31:09.684433 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-webhook-snmp-2-build_b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb/manage-dockerfile/0.log"
Mar 19 00:31:10 crc kubenswrapper[4745]: I0319 00:31:10.640857 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerStarted","Data":"183bda858c89f1750355a842d8b2c90e06898d12268cec8ffa2340518a400206"}
Mar 19 00:31:10 crc kubenswrapper[4745]: I0319 00:31:10.675573 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-webhook-snmp-2-build" podStartSLOduration=5.675542542 podStartE2EDuration="5.675542542s" podCreationTimestamp="2026-03-19 00:31:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:31:10.670850145 +0000 UTC m=+1435.209045286" watchObservedRunningTime="2026-03-19 00:31:10.675542542 +0000 UTC m=+1435.213737693"
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.145984 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564672-tpcgv"]
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.147394 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564672-tpcgv"
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.151634 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.151800 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.152772 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9"
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.154118 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564672-tpcgv"]
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.168234 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2w9f\" (UniqueName: \"kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f\") pod \"auto-csr-approver-29564672-tpcgv\" (UID: \"550c50ae-5519-4c0d-b2b0-7415d134808f\") " pod="openshift-infra/auto-csr-approver-29564672-tpcgv"
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.270634 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2w9f\" (UniqueName: \"kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f\") pod \"auto-csr-approver-29564672-tpcgv\" (UID: \"550c50ae-5519-4c0d-b2b0-7415d134808f\") " pod="openshift-infra/auto-csr-approver-29564672-tpcgv"
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.291680 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2w9f\" (UniqueName: \"kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f\") pod \"auto-csr-approver-29564672-tpcgv\" (UID: \"550c50ae-5519-4c0d-b2b0-7415d134808f\") " pod="openshift-infra/auto-csr-approver-29564672-tpcgv"
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.470850 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564672-tpcgv"
Mar 19 00:32:00 crc kubenswrapper[4745]: I0319 00:32:00.707916 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564672-tpcgv"]
Mar 19 00:32:01 crc kubenswrapper[4745]: I0319 00:32:01.306059 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" event={"ID":"550c50ae-5519-4c0d-b2b0-7415d134808f","Type":"ContainerStarted","Data":"0748e92f88b6a9584f3470bc17753ccc9bd18b6a77cecd36fb08ab3c8f42cac9"}
Mar 19 00:32:02 crc kubenswrapper[4745]: I0319 00:32:02.313959 4745 generic.go:334] "Generic (PLEG): container finished" podID="550c50ae-5519-4c0d-b2b0-7415d134808f" containerID="ad4d25cdb1eb7abf3f1713a4b642271a43e4a8fa68c0fb36024884e82f682adb" exitCode=0
Mar 19 00:32:02 crc kubenswrapper[4745]: I0319 00:32:02.314013 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" event={"ID":"550c50ae-5519-4c0d-b2b0-7415d134808f","Type":"ContainerDied","Data":"ad4d25cdb1eb7abf3f1713a4b642271a43e4a8fa68c0fb36024884e82f682adb"}
Mar 19 00:32:02 crc kubenswrapper[4745]: I0319 00:32:02.316991 4745 generic.go:334] "Generic (PLEG): container finished" podID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerID="183bda858c89f1750355a842d8b2c90e06898d12268cec8ffa2340518a400206" exitCode=0
Mar 19 00:32:02 crc kubenswrapper[4745]: I0319 00:32:02.317030 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerDied","Data":"183bda858c89f1750355a842d8b2c90e06898d12268cec8ffa2340518a400206"}
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.689354 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build"
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.693010 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564672-tpcgv"
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.784968 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785054 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785090 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785145 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785173 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785191 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785249 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785272 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785325 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785347 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785371 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785423 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2w9f\" (UniqueName: \"kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f\") pod \"550c50ae-5519-4c0d-b2b0-7415d134808f\" (UID: \"550c50ae-5519-4c0d-b2b0-7415d134808f\") "
Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785447 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6nvmj\" (UniqueName:
\"kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj\") pod \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\" (UID: \"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb\") " Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.785789 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.786367 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.786443 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.786860 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "build-system-configs". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.786893 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.787215 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.788554 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.792370 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj" (OuterVolumeSpecName: "kube-api-access-6nvmj") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "kube-api-access-6nvmj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.793283 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.794549 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.815892 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f" (OuterVolumeSpecName: "kube-api-access-c2w9f") pod "550c50ae-5519-4c0d-b2b0-7415d134808f" (UID: "550c50ae-5519-4c0d-b2b0-7415d134808f"). InnerVolumeSpecName "kube-api-access-c2w9f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886910 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6nvmj\" (UniqueName: \"kubernetes.io/projected/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-kube-api-access-6nvmj\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886950 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886962 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886975 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886986 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.886995 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.887003 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-node-pullsecrets\") on node 
\"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.887011 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.887020 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.887029 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2w9f\" (UniqueName: \"kubernetes.io/projected/550c50ae-5519-4c0d-b2b0-7415d134808f-kube-api-access-c2w9f\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.903763 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:03 crc kubenswrapper[4745]: I0319 00:32:03.988793 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.335328 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-webhook-snmp-2-build" event={"ID":"b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb","Type":"ContainerDied","Data":"8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6"} Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.335690 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8045e150559a823c4c3cf44e98e6b35528c4a4d13121e74266e16e52f4f60ab6" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.335405 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-webhook-snmp-2-build" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.337408 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" event={"ID":"550c50ae-5519-4c0d-b2b0-7415d134808f","Type":"ContainerDied","Data":"0748e92f88b6a9584f3470bc17753ccc9bd18b6a77cecd36fb08ab3c8f42cac9"} Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.337437 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0748e92f88b6a9584f3470bc17753ccc9bd18b6a77cecd36fb08ab3c8f42cac9" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.337498 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564672-tpcgv" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.676212 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" (UID: "b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.703013 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.786073 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564666-lrgz2"] Mar 19 00:32:04 crc kubenswrapper[4745]: I0319 00:32:04.791932 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564666-lrgz2"] Mar 19 00:32:06 crc kubenswrapper[4745]: I0319 00:32:06.146117 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fad60f0-0471-47eb-af8b-85d8a4a0c52f" path="/var/lib/kubelet/pods/6fad60f0-0471-47eb-af8b-85d8a4a0c52f/volumes" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.380907 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"] Mar 19 00:32:10 crc kubenswrapper[4745]: E0319 00:32:10.381866 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="docker-build" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.381897 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="docker-build" Mar 19 
00:32:10 crc kubenswrapper[4745]: E0319 00:32:10.381916 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="git-clone" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.381921 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="git-clone" Mar 19 00:32:10 crc kubenswrapper[4745]: E0319 00:32:10.381930 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="manage-dockerfile" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.381938 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="manage-dockerfile" Mar 19 00:32:10 crc kubenswrapper[4745]: E0319 00:32:10.381951 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550c50ae-5519-4c0d-b2b0-7415d134808f" containerName="oc" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.381956 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="550c50ae-5519-4c0d-b2b0-7415d134808f" containerName="oc" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.382057 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3cc8a26-ff02-4e22-b924-7ba0a0bf0cdb" containerName="docker-build" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.382072 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="550c50ae-5519-4c0d-b2b0-7415d134808f" containerName="oc" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.383009 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.384945 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.385028 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4wz7\" (UniqueName: \"kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.385096 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.392545 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"] Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.485945 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.486023 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.486070 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4wz7\" (UniqueName: \"kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.486754 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.487034 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.512906 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4wz7\" (UniqueName: \"kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7\") pod \"redhat-operators-cwrwg\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:10 crc kubenswrapper[4745]: I0319 00:32:10.703515 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:11 crc kubenswrapper[4745]: I0319 00:32:11.111062 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"] Mar 19 00:32:11 crc kubenswrapper[4745]: I0319 00:32:11.382099 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerStarted","Data":"9f4642b49f8bb7f219967f76570586d6870f2fbf5d9b181304387179ff6fbf3b"} Mar 19 00:32:12 crc kubenswrapper[4745]: I0319 00:32:12.391259 4745 generic.go:334] "Generic (PLEG): container finished" podID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerID="6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8" exitCode=0 Mar 19 00:32:12 crc kubenswrapper[4745]: I0319 00:32:12.391321 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerDied","Data":"6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8"} Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.026771 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.028286 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.030447 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-sys-config" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.030670 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.030711 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-ca" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.030765 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-1-global-ca" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.056091 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134753 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134857 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc 
kubenswrapper[4745]: I0319 00:32:13.134897 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134916 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134936 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cngl\" (UniqueName: \"kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134953 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.134987 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.135031 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.135083 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.135104 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.135127 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: 
\"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.135149 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236344 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236404 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236435 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236466 4745 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236510 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236544 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236572 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236598 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " 
pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236638 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236660 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cngl\" (UniqueName: \"kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236685 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236737 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236817 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.236957 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.237139 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.237172 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.237377 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: 
I0319 00:32:13.237615 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.237859 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.237918 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.238661 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.243005 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-bundle-1-build\" (UID: 
\"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.246402 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.263382 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cngl\" (UniqueName: \"kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl\") pod \"service-telemetry-operator-bundle-1-build\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.345751 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.434840 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerStarted","Data":"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7"} Mar 19 00:32:13 crc kubenswrapper[4745]: I0319 00:32:13.691656 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 19 00:32:13 crc kubenswrapper[4745]: W0319 00:32:13.693009 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b87c190_e4f0_4423_bcb7_942badcf90a9.slice/crio-58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9 WatchSource:0}: Error finding container 58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9: Status 404 returned error can't find the container with id 58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9 Mar 19 00:32:14 crc kubenswrapper[4745]: I0319 00:32:14.444353 4745 generic.go:334] "Generic (PLEG): container finished" podID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerID="fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7" exitCode=0 Mar 19 00:32:14 crc kubenswrapper[4745]: I0319 00:32:14.444459 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerDied","Data":"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7"} Mar 19 00:32:14 crc kubenswrapper[4745]: I0319 00:32:14.447529 4745 generic.go:334] "Generic (PLEG): container finished" podID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerID="974a4b2b0120bf7547c402b35bf7ecab55db0da6f49394541dc6bc7af4cdda92" exitCode=0 Mar 19 00:32:14 crc 
kubenswrapper[4745]: I0319 00:32:14.447658 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"7b87c190-e4f0-4423-bcb7-942badcf90a9","Type":"ContainerDied","Data":"974a4b2b0120bf7547c402b35bf7ecab55db0da6f49394541dc6bc7af4cdda92"} Mar 19 00:32:14 crc kubenswrapper[4745]: I0319 00:32:14.447779 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"7b87c190-e4f0-4423-bcb7-942badcf90a9","Type":"ContainerStarted","Data":"58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9"} Mar 19 00:32:15 crc kubenswrapper[4745]: I0319 00:32:15.460531 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"7b87c190-e4f0-4423-bcb7-942badcf90a9","Type":"ContainerStarted","Data":"f313fbb3c21be61114f257490bb0a77393588572276eeaf994f032d21e90ad1a"} Mar 19 00:32:15 crc kubenswrapper[4745]: I0319 00:32:15.495136 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-1-build" podStartSLOduration=2.495104659 podStartE2EDuration="2.495104659s" podCreationTimestamp="2026-03-19 00:32:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:32:15.494227733 +0000 UTC m=+1500.032422884" watchObservedRunningTime="2026-03-19 00:32:15.495104659 +0000 UTC m=+1500.033299790" Mar 19 00:32:19 crc kubenswrapper[4745]: I0319 00:32:19.510457 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerStarted","Data":"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d"} Mar 19 00:32:19 crc kubenswrapper[4745]: I0319 00:32:19.536767 4745 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cwrwg" podStartSLOduration=6.0298041 podStartE2EDuration="9.536745421s" podCreationTimestamp="2026-03-19 00:32:10 +0000 UTC" firstStartedPulling="2026-03-19 00:32:12.394078698 +0000 UTC m=+1496.932273829" lastFinishedPulling="2026-03-19 00:32:15.901020019 +0000 UTC m=+1500.439215150" observedRunningTime="2026-03-19 00:32:19.535572594 +0000 UTC m=+1504.073767735" watchObservedRunningTime="2026-03-19 00:32:19.536745421 +0000 UTC m=+1504.074940542" Mar 19 00:32:20 crc kubenswrapper[4745]: I0319 00:32:20.519688 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_7b87c190-e4f0-4423-bcb7-942badcf90a9/docker-build/0.log" Mar 19 00:32:20 crc kubenswrapper[4745]: I0319 00:32:20.521456 4745 generic.go:334] "Generic (PLEG): container finished" podID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerID="f313fbb3c21be61114f257490bb0a77393588572276eeaf994f032d21e90ad1a" exitCode=1 Mar 19 00:32:20 crc kubenswrapper[4745]: I0319 00:32:20.521589 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"7b87c190-e4f0-4423-bcb7-942badcf90a9","Type":"ContainerDied","Data":"f313fbb3c21be61114f257490bb0a77393588572276eeaf994f032d21e90ad1a"} Mar 19 00:32:20 crc kubenswrapper[4745]: I0319 00:32:20.704635 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:20 crc kubenswrapper[4745]: I0319 00:32:20.705129 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.752441 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-cwrwg" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" 
containerName="registry-server" probeResult="failure" output=< Mar 19 00:32:21 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s Mar 19 00:32:21 crc kubenswrapper[4745]: > Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.784123 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_7b87c190-e4f0-4423-bcb7-942badcf90a9/docker-build/0.log" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.784559 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880351 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880436 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880485 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880494 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets" 
(OuterVolumeSpecName: "node-pullsecrets") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880527 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880554 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880610 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880633 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880664 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cngl\" (UniqueName: \"kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: 
\"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880697 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880703 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.880966 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.881117 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.881140 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root\") pod \"7b87c190-e4f0-4423-bcb7-942badcf90a9\" (UID: \"7b87c190-e4f0-4423-bcb7-942badcf90a9\") " Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 
00:32:21.881804 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882389 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882269 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882291 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882728 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882731 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882973 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.882997 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.883456 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.888604 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.888668 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl" (OuterVolumeSpecName: "kube-api-access-9cngl") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "kube-api-access-9cngl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.889048 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "7b87c190-e4f0-4423-bcb7-942badcf90a9" (UID: "7b87c190-e4f0-4423-bcb7-942badcf90a9"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983771 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983811 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983823 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983832 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983843 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/7b87c190-e4f0-4423-bcb7-942badcf90a9-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983851 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983863 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-buildworkdir\") on node \"crc\" 
DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983873 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/7b87c190-e4f0-4423-bcb7-942badcf90a9-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983908 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cngl\" (UniqueName: \"kubernetes.io/projected/7b87c190-e4f0-4423-bcb7-942badcf90a9-kube-api-access-9cngl\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:21 crc kubenswrapper[4745]: I0319 00:32:21.983917 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/7b87c190-e4f0-4423-bcb7-942badcf90a9-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:22 crc kubenswrapper[4745]: I0319 00:32:22.537848 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-1-build_7b87c190-e4f0-4423-bcb7-942badcf90a9/docker-build/0.log" Mar 19 00:32:22 crc kubenswrapper[4745]: I0319 00:32:22.538444 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-1-build" event={"ID":"7b87c190-e4f0-4423-bcb7-942badcf90a9","Type":"ContainerDied","Data":"58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9"} Mar 19 00:32:22 crc kubenswrapper[4745]: I0319 00:32:22.538501 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58b49faa453787cdddcf37add99bc7b8bfd7ed03549b92efb668128ff84b7fc9" Mar 19 00:32:22 crc kubenswrapper[4745]: I0319 00:32:22.538734 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-1-build" Mar 19 00:32:23 crc kubenswrapper[4745]: I0319 00:32:23.523057 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 19 00:32:23 crc kubenswrapper[4745]: I0319 00:32:23.529873 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-1-build"] Mar 19 00:32:24 crc kubenswrapper[4745]: I0319 00:32:24.147316 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" path="/var/lib/kubelet/pods/7b87c190-e4f0-4423-bcb7-942badcf90a9/volumes" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.206329 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 19 00:32:25 crc kubenswrapper[4745]: E0319 00:32:25.206667 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerName="docker-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.206681 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerName="docker-build" Mar 19 00:32:25 crc kubenswrapper[4745]: E0319 00:32:25.206693 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerName="manage-dockerfile" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.206700 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerName="manage-dockerfile" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.206809 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b87c190-e4f0-4423-bcb7-942badcf90a9" containerName="docker-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.209193 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.212084 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-sys-config" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.212084 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-global-ca" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.212089 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.212686 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-bundle-2-ca" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.228272 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.334738 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.334782 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.334808 
4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.334835 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.334855 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335072 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335137 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335378 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335446 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335506 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335602 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvpkr\" (UniqueName: \"kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: 
\"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.335701 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.436873 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvpkr\" (UniqueName: \"kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437274 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437450 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437547 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437452 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437630 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437803 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.437987 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " 
pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438068 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438108 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438212 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438339 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438490 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438621 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438745 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438519 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.438794 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 
crc kubenswrapper[4745]: I0319 00:32:25.439007 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.439117 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.439385 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.439608 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.444246 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull\") pod 
\"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.444297 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.458922 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvpkr\" (UniqueName: \"kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr\") pod \"service-telemetry-operator-bundle-2-build\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.527070 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:25 crc kubenswrapper[4745]: I0319 00:32:25.765293 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-bundle-2-build"] Mar 19 00:32:26 crc kubenswrapper[4745]: I0319 00:32:26.571604 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerStarted","Data":"34a689b37d20c977ba6ca58741a9591bcdc32db9ecb4715c1a496a9b03fde054"} Mar 19 00:32:26 crc kubenswrapper[4745]: I0319 00:32:26.572011 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerStarted","Data":"95124c5725c9d01c054231894a7b4772b99823bf7deea19e1ef7de9c84f0ca6e"} Mar 19 00:32:27 crc kubenswrapper[4745]: I0319 00:32:27.580327 4745 generic.go:334] "Generic (PLEG): container finished" podID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerID="34a689b37d20c977ba6ca58741a9591bcdc32db9ecb4715c1a496a9b03fde054" exitCode=0 Mar 19 00:32:27 crc kubenswrapper[4745]: I0319 00:32:27.580386 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerDied","Data":"34a689b37d20c977ba6ca58741a9591bcdc32db9ecb4715c1a496a9b03fde054"} Mar 19 00:32:28 crc kubenswrapper[4745]: I0319 00:32:28.594202 4745 generic.go:334] "Generic (PLEG): container finished" podID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerID="9742c972b6ec6febdd5f2783832bb25d24125697d9b00e4733b8583e4a9188f1" exitCode=0 Mar 19 00:32:28 crc kubenswrapper[4745]: I0319 00:32:28.594391 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" 
event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerDied","Data":"9742c972b6ec6febdd5f2783832bb25d24125697d9b00e4733b8583e4a9188f1"} Mar 19 00:32:28 crc kubenswrapper[4745]: I0319 00:32:28.632870 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-bundle-2-build_00084aa5-c66b-4c62-a0c3-422e3c02286a/manage-dockerfile/0.log" Mar 19 00:32:29 crc kubenswrapper[4745]: I0319 00:32:29.605237 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerStarted","Data":"b4ec71f0453639d110d361d2fd4ca7b279631514bab4c9ba34ffe4b52644ca79"} Mar 19 00:32:29 crc kubenswrapper[4745]: I0319 00:32:29.631554 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-bundle-2-build" podStartSLOduration=4.631528646 podStartE2EDuration="4.631528646s" podCreationTimestamp="2026-03-19 00:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:32:29.631152714 +0000 UTC m=+1514.169347865" watchObservedRunningTime="2026-03-19 00:32:29.631528646 +0000 UTC m=+1514.169723787" Mar 19 00:32:30 crc kubenswrapper[4745]: I0319 00:32:30.747843 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:30 crc kubenswrapper[4745]: I0319 00:32:30.792962 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:30 crc kubenswrapper[4745]: I0319 00:32:30.990954 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"] Mar 19 00:32:31 crc kubenswrapper[4745]: I0319 00:32:31.623027 4745 generic.go:334] "Generic (PLEG): container finished" 
podID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerID="b4ec71f0453639d110d361d2fd4ca7b279631514bab4c9ba34ffe4b52644ca79" exitCode=0 Mar 19 00:32:31 crc kubenswrapper[4745]: I0319 00:32:31.623072 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerDied","Data":"b4ec71f0453639d110d361d2fd4ca7b279631514bab4c9ba34ffe4b52644ca79"} Mar 19 00:32:32 crc kubenswrapper[4745]: I0319 00:32:32.630630 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cwrwg" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="registry-server" containerID="cri-o://ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d" gracePeriod=2 Mar 19 00:32:32 crc kubenswrapper[4745]: I0319 00:32:32.900126 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.013057 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050383 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050448 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050504 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050571 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050594 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050626 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050650 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050675 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050711 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050742 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050763 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvpkr\" (UniqueName: 
\"kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.050794 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir\") pod \"00084aa5-c66b-4c62-a0c3-422e3c02286a\" (UID: \"00084aa5-c66b-4c62-a0c3-422e3c02286a\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.051186 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.051499 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.052284 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.052328 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.052667 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.052779 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.054507 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.055076 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.057145 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.058329 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.058433 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr" (OuterVolumeSpecName: "kube-api-access-lvpkr") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "kube-api-access-lvpkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.061423 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "00084aa5-c66b-4c62-a0c3-422e3c02286a" (UID: "00084aa5-c66b-4c62-a0c3-422e3c02286a"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152028 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content\") pod \"197ef947-3c30-4a50-ade4-01f72410e5cf\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152178 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities\") pod \"197ef947-3c30-4a50-ade4-01f72410e5cf\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152244 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4wz7\" (UniqueName: \"kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7\") pod \"197ef947-3c30-4a50-ade4-01f72410e5cf\" (UID: \"197ef947-3c30-4a50-ade4-01f72410e5cf\") " Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152526 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvpkr\" (UniqueName: \"kubernetes.io/projected/00084aa5-c66b-4c62-a0c3-422e3c02286a-kube-api-access-lvpkr\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152546 4745 reconciler_common.go:293] "Volume detached for 
volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152559 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152572 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152582 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152593 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152603 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152613 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/00084aa5-c66b-4c62-a0c3-422e3c02286a-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152623 4745 reconciler_common.go:293] "Volume detached for volume 
\"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/00084aa5-c66b-4c62-a0c3-422e3c02286a-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152634 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152645 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/00084aa5-c66b-4c62-a0c3-422e3c02286a-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.152658 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/00084aa5-c66b-4c62-a0c3-422e3c02286a-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.154034 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities" (OuterVolumeSpecName: "utilities") pod "197ef947-3c30-4a50-ade4-01f72410e5cf" (UID: "197ef947-3c30-4a50-ade4-01f72410e5cf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.156051 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7" (OuterVolumeSpecName: "kube-api-access-n4wz7") pod "197ef947-3c30-4a50-ade4-01f72410e5cf" (UID: "197ef947-3c30-4a50-ade4-01f72410e5cf"). InnerVolumeSpecName "kube-api-access-n4wz7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.254445 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.254717 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4wz7\" (UniqueName: \"kubernetes.io/projected/197ef947-3c30-4a50-ade4-01f72410e5cf-kube-api-access-n4wz7\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.294450 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "197ef947-3c30-4a50-ade4-01f72410e5cf" (UID: "197ef947-3c30-4a50-ade4-01f72410e5cf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.355760 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/197ef947-3c30-4a50-ade4-01f72410e5cf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.638125 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-bundle-2-build" event={"ID":"00084aa5-c66b-4c62-a0c3-422e3c02286a","Type":"ContainerDied","Data":"95124c5725c9d01c054231894a7b4772b99823bf7deea19e1ef7de9c84f0ca6e"} Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.638184 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95124c5725c9d01c054231894a7b4772b99823bf7deea19e1ef7de9c84f0ca6e" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.639417 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-bundle-2-build" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.640168 4745 generic.go:334] "Generic (PLEG): container finished" podID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerID="ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d" exitCode=0 Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.640203 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cwrwg" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.640206 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerDied","Data":"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d"} Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.640295 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cwrwg" event={"ID":"197ef947-3c30-4a50-ade4-01f72410e5cf","Type":"ContainerDied","Data":"9f4642b49f8bb7f219967f76570586d6870f2fbf5d9b181304387179ff6fbf3b"} Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.640317 4745 scope.go:117] "RemoveContainer" containerID="ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.665313 4745 scope.go:117] "RemoveContainer" containerID="fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.681145 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"] Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.688279 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cwrwg"] Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.704722 4745 scope.go:117] "RemoveContainer" 
containerID="6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.720242 4745 scope.go:117] "RemoveContainer" containerID="ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d" Mar 19 00:32:33 crc kubenswrapper[4745]: E0319 00:32:33.720714 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d\": container with ID starting with ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d not found: ID does not exist" containerID="ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.720777 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d"} err="failed to get container status \"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d\": rpc error: code = NotFound desc = could not find container \"ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d\": container with ID starting with ee0b703f5d8b4edacc3cb8bdf8f148a0906a8c99fc7949de63d61c710803865d not found: ID does not exist" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.720810 4745 scope.go:117] "RemoveContainer" containerID="fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7" Mar 19 00:32:33 crc kubenswrapper[4745]: E0319 00:32:33.721270 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7\": container with ID starting with fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7 not found: ID does not exist" containerID="fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7" Mar 19 00:32:33 crc 
kubenswrapper[4745]: I0319 00:32:33.721294 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7"} err="failed to get container status \"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7\": rpc error: code = NotFound desc = could not find container \"fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7\": container with ID starting with fbefc99a0358ccd3c6172d2bfb56aa81b1482afdfe80ea7a7c24e7223d331aa7 not found: ID does not exist" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.721311 4745 scope.go:117] "RemoveContainer" containerID="6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8" Mar 19 00:32:33 crc kubenswrapper[4745]: E0319 00:32:33.721567 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8\": container with ID starting with 6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8 not found: ID does not exist" containerID="6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8" Mar 19 00:32:33 crc kubenswrapper[4745]: I0319 00:32:33.721588 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8"} err="failed to get container status \"6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8\": rpc error: code = NotFound desc = could not find container \"6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8\": container with ID starting with 6e9b8340337dced8c1d5e31d98348b91527434acab8a54311f0dbca6178612e8 not found: ID does not exist" Mar 19 00:32:34 crc kubenswrapper[4745]: I0319 00:32:34.147158 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" 
path="/var/lib/kubelet/pods/197ef947-3c30-4a50-ade4-01f72410e5cf/volumes" Mar 19 00:32:35 crc kubenswrapper[4745]: I0319 00:32:35.471772 4745 scope.go:117] "RemoveContainer" containerID="6373326992b421fd82709a30cf5b66d6c20b4b21e2598084ed73e4aa5185678e" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656196 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656566 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="extract-utilities" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656585 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="extract-utilities" Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656595 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="manage-dockerfile" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656603 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="manage-dockerfile" Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656617 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="extract-content" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656630 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="extract-content" Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656648 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="git-clone" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656656 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" 
containerName="git-clone" Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656682 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="registry-server" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656689 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="registry-server" Mar 19 00:32:36 crc kubenswrapper[4745]: E0319 00:32:36.656696 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="docker-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656706 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="docker-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656838 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="00084aa5-c66b-4c62-a0c3-422e3c02286a" containerName="docker-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.656854 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="197ef947-3c30-4a50-ade4-01f72410e5cf" containerName="registry-server" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.657662 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.660407 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-ca" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.660463 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-sys-config" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.660974 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-1-global-ca" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.661325 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.677287 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.801928 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802023 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802053 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802113 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802145 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802186 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802206 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5pl2\" (UniqueName: \"kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2\") pod \"smart-gateway-operator-bundle-1-build\" 
(UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802357 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802415 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802618 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802751 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.802816 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.903819 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904375 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904425 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904450 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: 
\"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904474 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904502 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904553 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904579 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k5pl2\" (UniqueName: \"kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904621 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904654 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904687 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904727 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.904721 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc 
kubenswrapper[4745]: I0319 00:32:36.905104 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.905126 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.905180 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.905368 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.905375 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " 
pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.906264 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.906321 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.906871 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.911585 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.917922 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: 
\"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.923640 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5pl2\" (UniqueName: \"kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2\") pod \"smart-gateway-operator-bundle-1-build\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") " pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:36 crc kubenswrapper[4745]: I0319 00:32:36.974583 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build" Mar 19 00:32:37 crc kubenswrapper[4745]: I0319 00:32:37.184161 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"] Mar 19 00:32:37 crc kubenswrapper[4745]: I0319 00:32:37.672756 4745 generic.go:334] "Generic (PLEG): container finished" podID="479407cb-fdef-474d-a564-881954a984db" containerID="41cdf9f33044f6a4909a9e2e26ad76fb6b92253759abe1d7140516760d28b75c" exitCode=0 Mar 19 00:32:37 crc kubenswrapper[4745]: I0319 00:32:37.672835 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"479407cb-fdef-474d-a564-881954a984db","Type":"ContainerDied","Data":"41cdf9f33044f6a4909a9e2e26ad76fb6b92253759abe1d7140516760d28b75c"} Mar 19 00:32:37 crc kubenswrapper[4745]: I0319 00:32:37.672921 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"479407cb-fdef-474d-a564-881954a984db","Type":"ContainerStarted","Data":"66300060985a6ce250cf2a184b02513f9286f734f0e799ca10c3c982703dd624"} Mar 19 00:32:38 crc 
kubenswrapper[4745]: I0319 00:32:38.682000 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_479407cb-fdef-474d-a564-881954a984db/docker-build/0.log"
Mar 19 00:32:38 crc kubenswrapper[4745]: I0319 00:32:38.683036 4745 generic.go:334] "Generic (PLEG): container finished" podID="479407cb-fdef-474d-a564-881954a984db" containerID="3d3abcaceec0d44feeaa99fe5fc507d2939843b4e5f2688e33f19d72f84aabe1" exitCode=1
Mar 19 00:32:38 crc kubenswrapper[4745]: I0319 00:32:38.683097 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"479407cb-fdef-474d-a564-881954a984db","Type":"ContainerDied","Data":"3d3abcaceec0d44feeaa99fe5fc507d2939843b4e5f2688e33f19d72f84aabe1"}
Mar 19 00:32:39 crc kubenswrapper[4745]: I0319 00:32:39.986796 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_479407cb-fdef-474d-a564-881954a984db/docker-build/0.log"
Mar 19 00:32:39 crc kubenswrapper[4745]: I0319 00:32:39.987566 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052275 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052339 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052361 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052406 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k5pl2\" (UniqueName: \"kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052428 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052426 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052648 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052675 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052702 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052730 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052755 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052786 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052840 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs\") pod \"479407cb-fdef-474d-a564-881954a984db\" (UID: \"479407cb-fdef-474d-a564-881954a984db\") "
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.052914 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053192 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-node-pullsecrets\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053210 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/479407cb-fdef-474d-a564-881954a984db-buildcachedir\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053249 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053614 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053653 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053672 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.053807 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.054070 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.054413 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.058065 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.058070 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2" (OuterVolumeSpecName: "kube-api-access-k5pl2") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "kube-api-access-k5pl2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.058347 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "479407cb-fdef-474d-a564-881954a984db" (UID: "479407cb-fdef-474d-a564-881954a984db"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154102 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154134 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-root\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154143 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/479407cb-fdef-474d-a564-881954a984db-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154152 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154162 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154170 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/479407cb-fdef-474d-a564-881954a984db-build-system-configs\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154178 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-build-blob-cache\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154187 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-buildworkdir\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154196 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k5pl2\" (UniqueName: \"kubernetes.io/projected/479407cb-fdef-474d-a564-881954a984db-kube-api-access-k5pl2\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.154206 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/479407cb-fdef-474d-a564-881954a984db-container-storage-run\") on node \"crc\" DevicePath \"\""
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.698971 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-1-build_479407cb-fdef-474d-a564-881954a984db/docker-build/0.log"
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.699853 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-1-build" event={"ID":"479407cb-fdef-474d-a564-881954a984db","Type":"ContainerDied","Data":"66300060985a6ce250cf2a184b02513f9286f734f0e799ca10c3c982703dd624"}
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.699907 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="66300060985a6ce250cf2a184b02513f9286f734f0e799ca10c3c982703dd624"
Mar 19 00:32:40 crc kubenswrapper[4745]: I0319 00:32:40.699998 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-1-build"
Mar 19 00:32:45 crc kubenswrapper[4745]: I0319 00:32:45.606707 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 00:32:45 crc kubenswrapper[4745]: I0319 00:32:45.608043 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 00:32:47 crc kubenswrapper[4745]: I0319 00:32:47.119316 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"]
Mar 19 00:32:47 crc kubenswrapper[4745]: I0319 00:32:47.128653 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-1-build"]
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.147286 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="479407cb-fdef-474d-a564-881954a984db" path="/var/lib/kubelet/pods/479407cb-fdef-474d-a564-881954a984db/volumes"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.824296 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"]
Mar 19 00:32:48 crc kubenswrapper[4745]: E0319 00:32:48.824633 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479407cb-fdef-474d-a564-881954a984db" containerName="manage-dockerfile"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.824651 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="479407cb-fdef-474d-a564-881954a984db" containerName="manage-dockerfile"
Mar 19 00:32:48 crc kubenswrapper[4745]: E0319 00:32:48.824673 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="479407cb-fdef-474d-a564-881954a984db" containerName="docker-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.824681 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="479407cb-fdef-474d-a564-881954a984db" containerName="docker-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.824843 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="479407cb-fdef-474d-a564-881954a984db" containerName="docker-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.825998 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.828987 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-sys-config"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.829104 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.829141 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-ca"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.829260 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"smart-gateway-operator-bundle-2-global-ca"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.850639 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"]
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.981754 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982315 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982367 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982389 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982415 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982435 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982467 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982489 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982514 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982554 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982604 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:48 crc kubenswrapper[4745]: I0319 00:32:48.982627 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn2fw\" (UniqueName: \"kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084350 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084439 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084475 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn2fw\" (UniqueName: \"kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084503 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084526 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084566 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084616 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084634 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084656 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084672 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084688 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.084953 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085011 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085117 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085211 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085345 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085507 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085561 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085619 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.085927 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.091919 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.096451 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.109913 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn2fw\" (UniqueName: \"kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw\") pod \"smart-gateway-operator-bundle-2-build\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.149255 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build"
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.460277 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-bundle-2-build"]
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.787469 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerStarted","Data":"fee07e53cb9d15670b053d3a0bff1e14a6fd7d3736db84b1b91153328d1b6b1f"}
Mar 19 00:32:49 crc kubenswrapper[4745]: I0319 00:32:49.787536 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerStarted","Data":"f1794e32a011e5644ffc79544be3fbbd276268a0dd008ac9067dc718c52ae74d"}
Mar 19 00:32:50 crc kubenswrapper[4745]: I0319 00:32:50.797347 4745 generic.go:334] "Generic (PLEG): container finished" podID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerID="fee07e53cb9d15670b053d3a0bff1e14a6fd7d3736db84b1b91153328d1b6b1f" exitCode=0
Mar 19 00:32:50 crc kubenswrapper[4745]: I0319 00:32:50.797435 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerDied","Data":"fee07e53cb9d15670b053d3a0bff1e14a6fd7d3736db84b1b91153328d1b6b1f"}
Mar 19 00:32:51 crc kubenswrapper[4745]: I0319 00:32:51.807735 4745 generic.go:334] "Generic (PLEG): container finished" podID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerID="f98cd947aae65c5442381e3256226970531d2fb07bf7814d884c6af3441f3d64" exitCode=0
Mar 19 00:32:51 crc kubenswrapper[4745]: I0319 00:32:51.807811 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerDied","Data":"f98cd947aae65c5442381e3256226970531d2fb07bf7814d884c6af3441f3d64"}
Mar 19 00:32:51 crc kubenswrapper[4745]: I0319 00:32:51.852169 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-bundle-2-build_9faac743-5f49-4d6a-bb53-da4b1178ee26/manage-dockerfile/0.log"
Mar 19 00:32:52 crc kubenswrapper[4745]: I0319 00:32:52.817244 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerStarted","Data":"fea6a727e51f4292705620c79681af6b9ca4114d156adcb01d758ba669b14e5e"}
Mar 19 00:32:52 crc kubenswrapper[4745]: I0319 00:32:52.854355 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-bundle-2-build" podStartSLOduration=4.854327183 podStartE2EDuration="4.854327183s" podCreationTimestamp="2026-03-19 00:32:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:32:52.848368357 +0000 UTC m=+1537.386563488" watchObservedRunningTime="2026-03-19 00:32:52.854327183 +0000 UTC m=+1537.392522314"
Mar 19 00:32:54 crc kubenswrapper[4745]: I0319 00:32:54.831659 4745 generic.go:334] "Generic (PLEG): container finished" podID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerID="fea6a727e51f4292705620c79681af6b9ca4114d156adcb01d758ba669b14e5e" exitCode=0
Mar 19 00:32:54 crc kubenswrapper[4745]: I0319 00:32:54.831856 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerDied","Data":"fea6a727e51f4292705620c79681af6b9ca4114d156adcb01d758ba669b14e5e"}
Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.095102 4745 util.go:48] "No ready sandbox for
pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196408 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196477 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196515 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196585 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196615 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196641 4745 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196668 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196688 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196715 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196737 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196760 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.196781 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pn2fw\" (UniqueName: \"kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw\") pod \"9faac743-5f49-4d6a-bb53-da4b1178ee26\" (UID: \"9faac743-5f49-4d6a-bb53-da4b1178ee26\") " Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.197059 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.197295 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.197824 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.198119 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.198363 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.198456 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.198610 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.200015 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.201627 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.203147 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw" (OuterVolumeSpecName: "kube-api-access-pn2fw") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "kube-api-access-pn2fw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.204065 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "builder-dockercfg-vcnqb-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.209118 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "9faac743-5f49-4d6a-bb53-da4b1178ee26" (UID: "9faac743-5f49-4d6a-bb53-da4b1178ee26"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298046 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298090 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298101 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298114 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298123 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9faac743-5f49-4d6a-bb53-da4b1178ee26-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298131 4745 
reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298140 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/9faac743-5f49-4d6a-bb53-da4b1178ee26-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298150 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298162 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pn2fw\" (UniqueName: \"kubernetes.io/projected/9faac743-5f49-4d6a-bb53-da4b1178ee26-kube-api-access-pn2fw\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298174 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298185 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9faac743-5f49-4d6a-bb53-da4b1178ee26-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.298194 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9faac743-5f49-4d6a-bb53-da4b1178ee26-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.848928 4745 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="service-telemetry/smart-gateway-operator-bundle-2-build" event={"ID":"9faac743-5f49-4d6a-bb53-da4b1178ee26","Type":"ContainerDied","Data":"f1794e32a011e5644ffc79544be3fbbd276268a0dd008ac9067dc718c52ae74d"} Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.848982 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1794e32a011e5644ffc79544be3fbbd276268a0dd008ac9067dc718c52ae74d" Mar 19 00:32:56 crc kubenswrapper[4745]: I0319 00:32:56.849021 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-bundle-2-build" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.981583 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 19 00:33:11 crc kubenswrapper[4745]: E0319 00:33:11.982567 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="git-clone" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.982583 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="git-clone" Mar 19 00:33:11 crc kubenswrapper[4745]: E0319 00:33:11.982615 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="manage-dockerfile" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.982623 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="manage-dockerfile" Mar 19 00:33:11 crc kubenswrapper[4745]: E0319 00:33:11.982633 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="docker-build" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.982641 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" 
containerName="docker-build" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.982775 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="9faac743-5f49-4d6a-bb53-da4b1178ee26" containerName="docker-build" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.983714 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.985730 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-vcnqb" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.985735 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-global-ca" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.986318 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-sys-config" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.986701 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-framework-index-dockercfg" Mar 19 00:33:11 crc kubenswrapper[4745]: I0319 00:33:11.988678 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-framework-index-1-ca" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.006539 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137132 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " 
pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137211 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137240 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frfhv\" (UniqueName: \"kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137269 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137300 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137334 4745 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137366 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137397 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137424 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137486 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: 
\"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137521 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137548 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.137576 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239278 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239376 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239412 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239436 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239460 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239481 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239564 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239594 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239611 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239632 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: 
\"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239660 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239686 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.239703 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frfhv\" (UniqueName: \"kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240023 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240136 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240218 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240381 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240437 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.240891 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: 
I0319 00:33:12.241181 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.241218 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.241325 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.246294 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.246318 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: 
\"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.251509 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.260180 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frfhv\" (UniqueName: \"kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv\") pod \"service-telemetry-framework-index-1-build\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.308658 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.523470 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-framework-index-1-build"] Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.971740 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerStarted","Data":"d9edc2ec4c7404182e0b8c14859b05a93541996296f20b1e6a8161d5fc3e98fc"} Mar 19 00:33:12 crc kubenswrapper[4745]: I0319 00:33:12.972250 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerStarted","Data":"56d6c6c471c42b322ae3628ab1707980810345401b838e1eae669d5f0a071ce1"} Mar 19 00:33:13 crc kubenswrapper[4745]: I0319 00:33:13.979419 4745 generic.go:334] "Generic (PLEG): container finished" podID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerID="d9edc2ec4c7404182e0b8c14859b05a93541996296f20b1e6a8161d5fc3e98fc" exitCode=0 Mar 19 00:33:13 crc kubenswrapper[4745]: I0319 00:33:13.979479 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerDied","Data":"d9edc2ec4c7404182e0b8c14859b05a93541996296f20b1e6a8161d5fc3e98fc"} Mar 19 00:33:14 crc kubenswrapper[4745]: I0319 00:33:14.986832 4745 generic.go:334] "Generic (PLEG): container finished" podID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerID="e53bb767b00c8d5bf2fa027d616f337147f469239d5f1158a1e3956e946eeab2" exitCode=0 Mar 19 00:33:14 crc kubenswrapper[4745]: I0319 00:33:14.986937 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" 
event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerDied","Data":"e53bb767b00c8d5bf2fa027d616f337147f469239d5f1158a1e3956e946eeab2"} Mar 19 00:33:15 crc kubenswrapper[4745]: I0319 00:33:15.041764 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-framework-index-1-build_047a89c0-d73e-49d9-bb4e-b01fcefe54a6/manage-dockerfile/0.log" Mar 19 00:33:15 crc kubenswrapper[4745]: I0319 00:33:15.606229 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:33:15 crc kubenswrapper[4745]: I0319 00:33:15.606748 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:33:15 crc kubenswrapper[4745]: I0319 00:33:15.998934 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerStarted","Data":"835795f6ed7df0d4af5545ce929b9f07a201501b5f1a75e3ad659c65d69b3147"} Mar 19 00:33:16 crc kubenswrapper[4745]: I0319 00:33:16.028466 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-framework-index-1-build" podStartSLOduration=5.028444837 podStartE2EDuration="5.028444837s" podCreationTimestamp="2026-03-19 00:33:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:33:16.027149306 +0000 UTC m=+1560.565344437" 
watchObservedRunningTime="2026-03-19 00:33:16.028444837 +0000 UTC m=+1560.566639958" Mar 19 00:33:44 crc kubenswrapper[4745]: I0319 00:33:44.203566 4745 generic.go:334] "Generic (PLEG): container finished" podID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerID="835795f6ed7df0d4af5545ce929b9f07a201501b5f1a75e3ad659c65d69b3147" exitCode=0 Mar 19 00:33:44 crc kubenswrapper[4745]: I0319 00:33:44.203676 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerDied","Data":"835795f6ed7df0d4af5545ce929b9f07a201501b5f1a75e3ad659c65d69b3147"} Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.557507 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.606167 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.606248 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.606315 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.607136 4745 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.607208 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" gracePeriod=600 Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650178 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650231 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650321 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650349 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: 
\"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650390 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650445 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650529 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.650458 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651186 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651284 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651344 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-frfhv\" (UniqueName: \"kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651390 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 
crc kubenswrapper[4745]: I0319 00:33:45.651441 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651491 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651544 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.651608 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles\") pod \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\" (UID: \"047a89c0-d73e-49d9-bb4e-b01fcefe54a6\") " Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.652260 4745 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildcachedir\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.652288 4745 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-system-configs\") on node \"crc\" DevicePath \"\"" Mar 
19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.652306 4745 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.652616 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.652736 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.653182 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.656160 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). 
InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.657456 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume" (OuterVolumeSpecName: "service-telemetry-framework-index-dockercfg-user-build-volume") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "service-telemetry-framework-index-dockercfg-user-build-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.657518 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv" (OuterVolumeSpecName: "kube-api-access-frfhv") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "kube-api-access-frfhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.658023 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-pull") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "builder-dockercfg-vcnqb-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.660042 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push" (OuterVolumeSpecName: "builder-dockercfg-vcnqb-push") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). 
InnerVolumeSpecName "builder-dockercfg-vcnqb-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: E0319 00:33:45.739380 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753401 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-run\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753437 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-frfhv\" (UniqueName: \"kubernetes.io/projected/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-kube-api-access-frfhv\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753447 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-push\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-push\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753460 4745 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-vcnqb-pull\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-builder-dockercfg-vcnqb-pull\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753470 4745 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-ca-bundles\") on node 
\"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753479 4745 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753490 4745 reconciler_common.go:293] "Volume detached for volume \"service-telemetry-framework-index-dockercfg-user-build-volume\" (UniqueName: \"kubernetes.io/secret/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-service-telemetry-framework-index-dockercfg-user-build-volume\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.753502 4745 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-buildworkdir\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.921403 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:33:45 crc kubenswrapper[4745]: I0319 00:33:45.956617 4745 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-build-blob-cache\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.223757 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-framework-index-1-build" event={"ID":"047a89c0-d73e-49d9-bb4e-b01fcefe54a6","Type":"ContainerDied","Data":"56d6c6c471c42b322ae3628ab1707980810345401b838e1eae669d5f0a071ce1"} Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.223829 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56d6c6c471c42b322ae3628ab1707980810345401b838e1eae669d5f0a071ce1" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.223776 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-framework-index-1-build" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.226031 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" exitCode=0 Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.226079 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"} Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.226127 4745 scope.go:117] "RemoveContainer" containerID="596b4f8d9ce25b5331069b1483120c23162574341fd13a90efeccafcfe8087f5" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.226474 4745 scope.go:117] "RemoveContainer" 
containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:33:46 crc kubenswrapper[4745]: E0319 00:33:46.226732 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.602011 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "047a89c0-d73e-49d9-bb4e-b01fcefe54a6" (UID: "047a89c0-d73e-49d9-bb4e-b01fcefe54a6"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:33:46 crc kubenswrapper[4745]: I0319 00:33:46.671041 4745 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/047a89c0-d73e-49d9-bb4e-b01fcefe54a6-container-storage-root\") on node \"crc\" DevicePath \"\"" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.114663 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:33:48 crc kubenswrapper[4745]: E0319 00:33:48.115068 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="manage-dockerfile" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.115086 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="manage-dockerfile" Mar 19 00:33:48 crc kubenswrapper[4745]: E0319 00:33:48.115104 4745 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="docker-build" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.115114 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="docker-build" Mar 19 00:33:48 crc kubenswrapper[4745]: E0319 00:33:48.115135 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="git-clone" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.115146 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="git-clone" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.115289 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="047a89c0-d73e-49d9-bb4e-b01fcefe54a6" containerName="docker-build" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.116018 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.119084 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"infrawatch-operators-dockercfg-f8hd8" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.124850 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.195085 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hj2tv\" (UniqueName: \"kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv\") pod \"infrawatch-operators-thzc6\" (UID: \"d546c548-8ccd-4a8f-b790-7ba7e7340939\") " pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.296322 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-hj2tv\" (UniqueName: \"kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv\") pod \"infrawatch-operators-thzc6\" (UID: \"d546c548-8ccd-4a8f-b790-7ba7e7340939\") " pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.317998 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hj2tv\" (UniqueName: \"kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv\") pod \"infrawatch-operators-thzc6\" (UID: \"d546c548-8ccd-4a8f-b790-7ba7e7340939\") " pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.449071 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.679406 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:33:48 crc kubenswrapper[4745]: I0319 00:33:48.686496 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 00:33:49 crc kubenswrapper[4745]: I0319 00:33:49.255948 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-thzc6" event={"ID":"d546c548-8ccd-4a8f-b790-7ba7e7340939","Type":"ContainerStarted","Data":"20f2ca013f9548cfbe6b25313aa9e1d6ca51c45681defe7c66c725315be6f45b"} Mar 19 00:33:50 crc kubenswrapper[4745]: I0319 00:33:50.697504 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:33:50 crc kubenswrapper[4745]: I0319 00:33:50.915106 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-mrqcc"] Mar 19 00:33:50 crc kubenswrapper[4745]: I0319 00:33:50.916691 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:33:50 crc kubenswrapper[4745]: I0319 00:33:50.925265 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-mrqcc"] Mar 19 00:33:51 crc kubenswrapper[4745]: I0319 00:33:51.039850 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxwjc\" (UniqueName: \"kubernetes.io/projected/9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b-kube-api-access-nxwjc\") pod \"infrawatch-operators-mrqcc\" (UID: \"9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b\") " pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:33:51 crc kubenswrapper[4745]: I0319 00:33:51.142187 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxwjc\" (UniqueName: \"kubernetes.io/projected/9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b-kube-api-access-nxwjc\") pod \"infrawatch-operators-mrqcc\" (UID: \"9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b\") " pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:33:51 crc kubenswrapper[4745]: I0319 00:33:51.168930 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxwjc\" (UniqueName: \"kubernetes.io/projected/9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b-kube-api-access-nxwjc\") pod \"infrawatch-operators-mrqcc\" (UID: \"9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b\") " pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:33:51 crc kubenswrapper[4745]: I0319 00:33:51.241312 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:33:51 crc kubenswrapper[4745]: I0319 00:33:51.670501 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-mrqcc"] Mar 19 00:33:54 crc kubenswrapper[4745]: I0319 00:33:54.300022 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-mrqcc" event={"ID":"9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b","Type":"ContainerStarted","Data":"cb67907a7b34baccdc293c4638d2b2130457235999263af3e6d8ae28aa5a13b2"} Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.147938 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564674-2zhcp"] Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.149426 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.150844 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564674-2zhcp"] Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.154424 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.154515 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.154694 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.181446 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nktw2\" (UniqueName: \"kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2\") pod \"auto-csr-approver-29564674-2zhcp\" (UID: 
\"c9fad81f-d73f-4e01-9a07-66b20741533e\") " pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.282590 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nktw2\" (UniqueName: \"kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2\") pod \"auto-csr-approver-29564674-2zhcp\" (UID: \"c9fad81f-d73f-4e01-9a07-66b20741533e\") " pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.305596 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nktw2\" (UniqueName: \"kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2\") pod \"auto-csr-approver-29564674-2zhcp\" (UID: \"c9fad81f-d73f-4e01-9a07-66b20741533e\") " pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:00 crc kubenswrapper[4745]: I0319 00:34:00.473229 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:01 crc kubenswrapper[4745]: I0319 00:34:01.138350 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:34:01 crc kubenswrapper[4745]: E0319 00:34:01.139620 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:34:04 crc kubenswrapper[4745]: E0319 00:34:04.034449 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest" Mar 19 00:34:04 crc kubenswrapper[4745]: E0319 00:34:04.035187 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:registry-server,Image:image-registry.openshift-image-registry.svc:5000/service-telemetry/service-telemetry-framework-index:latest,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:grpc,HostPort:0,ContainerPort:50051,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hj2tv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe 
-addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[grpc_health_probe -addr=:50051],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:10,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod infrawatch-operators-thzc6_service-telemetry(d546c548-8ccd-4a8f-b790-7ba7e7340939): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 00:34:04 crc kubenswrapper[4745]: E0319 00:34:04.036466 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"registry-server\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/infrawatch-operators-thzc6" podUID="d546c548-8ccd-4a8f-b790-7ba7e7340939" Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.383502 4745 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/infrawatch-operators-mrqcc" event={"ID":"9d01cbfd-53e4-4afc-8cc9-00f2b1694c2b","Type":"ContainerStarted","Data":"d8b30130338c71d17fef775727bdedeb621de7a2d31a5998975d589b9409aec7"} Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.394816 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564674-2zhcp"] Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.441029 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-mrqcc" podStartSLOduration=4.488774318 podStartE2EDuration="14.441001424s" podCreationTimestamp="2026-03-19 00:33:50 +0000 UTC" firstStartedPulling="2026-03-19 00:33:54.122388973 +0000 UTC m=+1598.660584104" lastFinishedPulling="2026-03-19 00:34:04.074616079 +0000 UTC m=+1608.612811210" observedRunningTime="2026-03-19 00:34:04.433635725 +0000 UTC m=+1608.971830876" watchObservedRunningTime="2026-03-19 00:34:04.441001424 +0000 UTC m=+1608.979196555" Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.604713 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.649915 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hj2tv\" (UniqueName: \"kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv\") pod \"d546c548-8ccd-4a8f-b790-7ba7e7340939\" (UID: \"d546c548-8ccd-4a8f-b790-7ba7e7340939\") " Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.656124 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv" (OuterVolumeSpecName: "kube-api-access-hj2tv") pod "d546c548-8ccd-4a8f-b790-7ba7e7340939" (UID: "d546c548-8ccd-4a8f-b790-7ba7e7340939"). InnerVolumeSpecName "kube-api-access-hj2tv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:34:04 crc kubenswrapper[4745]: I0319 00:34:04.751667 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hj2tv\" (UniqueName: \"kubernetes.io/projected/d546c548-8ccd-4a8f-b790-7ba7e7340939-kube-api-access-hj2tv\") on node \"crc\" DevicePath \"\"" Mar 19 00:34:05 crc kubenswrapper[4745]: I0319 00:34:05.396397 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" event={"ID":"c9fad81f-d73f-4e01-9a07-66b20741533e","Type":"ContainerStarted","Data":"77c6e8fd219209e3c00449d14e53d4b561cf95d96fc04068d5c4a90d228c1965"} Mar 19 00:34:05 crc kubenswrapper[4745]: I0319 00:34:05.399334 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-thzc6" event={"ID":"d546c548-8ccd-4a8f-b790-7ba7e7340939","Type":"ContainerDied","Data":"20f2ca013f9548cfbe6b25313aa9e1d6ca51c45681defe7c66c725315be6f45b"} Mar 19 00:34:05 crc kubenswrapper[4745]: I0319 00:34:05.399367 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-thzc6" Mar 19 00:34:05 crc kubenswrapper[4745]: I0319 00:34:05.482120 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:34:05 crc kubenswrapper[4745]: I0319 00:34:05.501348 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-thzc6"] Mar 19 00:34:06 crc kubenswrapper[4745]: I0319 00:34:06.149632 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d546c548-8ccd-4a8f-b790-7ba7e7340939" path="/var/lib/kubelet/pods/d546c548-8ccd-4a8f-b790-7ba7e7340939/volumes" Mar 19 00:34:06 crc kubenswrapper[4745]: I0319 00:34:06.409299 4745 generic.go:334] "Generic (PLEG): container finished" podID="c9fad81f-d73f-4e01-9a07-66b20741533e" containerID="6316938ec69e0de5c06031d730ac5d55047c4061f5dea80ee35cf89954f73a68" exitCode=0 Mar 19 00:34:06 crc kubenswrapper[4745]: I0319 00:34:06.409367 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" event={"ID":"c9fad81f-d73f-4e01-9a07-66b20741533e","Type":"ContainerDied","Data":"6316938ec69e0de5c06031d730ac5d55047c4061f5dea80ee35cf89954f73a68"} Mar 19 00:34:07 crc kubenswrapper[4745]: I0319 00:34:07.699391 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:07 crc kubenswrapper[4745]: I0319 00:34:07.798742 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nktw2\" (UniqueName: \"kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2\") pod \"c9fad81f-d73f-4e01-9a07-66b20741533e\" (UID: \"c9fad81f-d73f-4e01-9a07-66b20741533e\") " Mar 19 00:34:07 crc kubenswrapper[4745]: I0319 00:34:07.806220 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2" (OuterVolumeSpecName: "kube-api-access-nktw2") pod "c9fad81f-d73f-4e01-9a07-66b20741533e" (UID: "c9fad81f-d73f-4e01-9a07-66b20741533e"). InnerVolumeSpecName "kube-api-access-nktw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:34:07 crc kubenswrapper[4745]: I0319 00:34:07.900994 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nktw2\" (UniqueName: \"kubernetes.io/projected/c9fad81f-d73f-4e01-9a07-66b20741533e-kube-api-access-nktw2\") on node \"crc\" DevicePath \"\"" Mar 19 00:34:08 crc kubenswrapper[4745]: I0319 00:34:08.426932 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" event={"ID":"c9fad81f-d73f-4e01-9a07-66b20741533e","Type":"ContainerDied","Data":"77c6e8fd219209e3c00449d14e53d4b561cf95d96fc04068d5c4a90d228c1965"} Mar 19 00:34:08 crc kubenswrapper[4745]: I0319 00:34:08.427340 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77c6e8fd219209e3c00449d14e53d4b561cf95d96fc04068d5c4a90d228c1965" Mar 19 00:34:08 crc kubenswrapper[4745]: I0319 00:34:08.427028 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564674-2zhcp" Mar 19 00:34:08 crc kubenswrapper[4745]: I0319 00:34:08.771314 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564668-66fs6"] Mar 19 00:34:08 crc kubenswrapper[4745]: I0319 00:34:08.775866 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564668-66fs6"] Mar 19 00:34:10 crc kubenswrapper[4745]: I0319 00:34:10.147323 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9ad0116-35eb-40db-8d57-4501affdf59c" path="/var/lib/kubelet/pods/c9ad0116-35eb-40db-8d57-4501affdf59c/volumes" Mar 19 00:34:11 crc kubenswrapper[4745]: I0319 00:34:11.241933 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:34:11 crc kubenswrapper[4745]: I0319 00:34:11.243028 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:34:11 crc kubenswrapper[4745]: I0319 00:34:11.274739 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:34:11 crc kubenswrapper[4745]: I0319 00:34:11.475432 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-mrqcc" Mar 19 00:34:15 crc kubenswrapper[4745]: I0319 00:34:15.138264 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:34:15 crc kubenswrapper[4745]: E0319 00:34:15.138917 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.549922 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24"] Mar 19 00:34:23 crc kubenswrapper[4745]: E0319 00:34:23.551045 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9fad81f-d73f-4e01-9a07-66b20741533e" containerName="oc" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.551067 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9fad81f-d73f-4e01-9a07-66b20741533e" containerName="oc" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.551195 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9fad81f-d73f-4e01-9a07-66b20741533e" containerName="oc" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.552259 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.564074 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24"] Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.722240 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqr4g\" (UniqueName: \"kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.722742 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.722955 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.824411 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqr4g\" (UniqueName: \"kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.824471 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.824560 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: 
\"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.825165 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.825324 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.849549 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqr4g\" (UniqueName: \"kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g\") pod \"500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") " pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:23 crc kubenswrapper[4745]: I0319 00:34:23.869490 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.326708 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24"] Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.370477 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4"] Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.372239 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.389235 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4"] Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.543919 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.544000 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25dr4\" (UniqueName: \"kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.544061 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.545007 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" event={"ID":"61a70d11-ba59-4ad2-8427-c28882835ad6","Type":"ContainerStarted","Data":"5e3d52819cccf117ee2102d83ab5a41d16baa92ad9b71c6b5093655e59ea25aa"} Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.645405 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.645529 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25dr4\" (UniqueName: \"kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.645741 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle\") pod 
\"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.646623 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.646725 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.670046 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25dr4\" (UniqueName: \"kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4\") pod \"372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") " pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.745405 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" Mar 19 00:34:24 crc kubenswrapper[4745]: I0319 00:34:24.972506 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4"] Mar 19 00:34:25 crc kubenswrapper[4745]: I0319 00:34:25.555939 4745 generic.go:334] "Generic (PLEG): container finished" podID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerID="db1da1c178895583cbbeb02f9a019334c12324d85bb84b0829090b6fec562ec1" exitCode=0 Mar 19 00:34:25 crc kubenswrapper[4745]: I0319 00:34:25.556066 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" event={"ID":"34f163ba-1cf2-47e5-847f-8db4eac30c29","Type":"ContainerDied","Data":"db1da1c178895583cbbeb02f9a019334c12324d85bb84b0829090b6fec562ec1"} Mar 19 00:34:25 crc kubenswrapper[4745]: I0319 00:34:25.556490 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" event={"ID":"34f163ba-1cf2-47e5-847f-8db4eac30c29","Type":"ContainerStarted","Data":"221db3a012c61b898efbb2032e252e9399d8603ae3c537e79d9ed30671636fc7"} Mar 19 00:34:25 crc kubenswrapper[4745]: I0319 00:34:25.560451 4745 generic.go:334] "Generic (PLEG): container finished" podID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerID="7aa2fb188e8dbc02204f77cd7f122296d6d12e090d081261a29dbfdb2dfd8a37" exitCode=0 Mar 19 00:34:25 crc kubenswrapper[4745]: I0319 00:34:25.560508 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" event={"ID":"61a70d11-ba59-4ad2-8427-c28882835ad6","Type":"ContainerDied","Data":"7aa2fb188e8dbc02204f77cd7f122296d6d12e090d081261a29dbfdb2dfd8a37"} Mar 19 00:34:26 crc kubenswrapper[4745]: I0319 00:34:26.569350 4745 generic.go:334] "Generic 
(PLEG): container finished" podID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerID="18c9701b802a7d35818e5a8247b67c03d39b001d0d53ccd381820ef55a41674a" exitCode=0 Mar 19 00:34:26 crc kubenswrapper[4745]: I0319 00:34:26.569726 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" event={"ID":"61a70d11-ba59-4ad2-8427-c28882835ad6","Type":"ContainerDied","Data":"18c9701b802a7d35818e5a8247b67c03d39b001d0d53ccd381820ef55a41674a"} Mar 19 00:34:26 crc kubenswrapper[4745]: I0319 00:34:26.571663 4745 generic.go:334] "Generic (PLEG): container finished" podID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerID="5846b2d515174b602b71e1177491af56a1f227f0410de497e5d9d1922b6a45fa" exitCode=0 Mar 19 00:34:26 crc kubenswrapper[4745]: I0319 00:34:26.571711 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" event={"ID":"34f163ba-1cf2-47e5-847f-8db4eac30c29","Type":"ContainerDied","Data":"5846b2d515174b602b71e1177491af56a1f227f0410de497e5d9d1922b6a45fa"} Mar 19 00:34:27 crc kubenswrapper[4745]: I0319 00:34:27.582563 4745 generic.go:334] "Generic (PLEG): container finished" podID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerID="a99982be9fea52e3d27bfed2ebd1ecfe164ace8fa077144182d10a2dd4a2c8d3" exitCode=0 Mar 19 00:34:27 crc kubenswrapper[4745]: I0319 00:34:27.582683 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" event={"ID":"34f163ba-1cf2-47e5-847f-8db4eac30c29","Type":"ContainerDied","Data":"a99982be9fea52e3d27bfed2ebd1ecfe164ace8fa077144182d10a2dd4a2c8d3"} Mar 19 00:34:27 crc kubenswrapper[4745]: I0319 00:34:27.585618 4745 generic.go:334] "Generic (PLEG): container finished" podID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerID="63096d30197409b38b02c27f3376d5ad2b7f979623cb77957e05fe00ffc5827a" exitCode=0 Mar 
19 00:34:27 crc kubenswrapper[4745]: I0319 00:34:27.585657 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" event={"ID":"61a70d11-ba59-4ad2-8427-c28882835ad6","Type":"ContainerDied","Data":"63096d30197409b38b02c27f3376d5ad2b7f979623cb77957e05fe00ffc5827a"}
Mar 19 00:34:28 crc kubenswrapper[4745]: I0319 00:34:28.138753 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:34:28 crc kubenswrapper[4745]: E0319 00:34:28.139434 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:34:28 crc kubenswrapper[4745]: I0319 00:34:28.880317 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24"
Mar 19 00:34:28 crc kubenswrapper[4745]: I0319 00:34:28.885354 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4"
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010228 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle\") pod \"34f163ba-1cf2-47e5-847f-8db4eac30c29\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") "
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010657 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util\") pod \"61a70d11-ba59-4ad2-8427-c28882835ad6\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") "
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010691 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqr4g\" (UniqueName: \"kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g\") pod \"61a70d11-ba59-4ad2-8427-c28882835ad6\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") "
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010739 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util\") pod \"34f163ba-1cf2-47e5-847f-8db4eac30c29\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") "
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010822 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle\") pod \"61a70d11-ba59-4ad2-8427-c28882835ad6\" (UID: \"61a70d11-ba59-4ad2-8427-c28882835ad6\") "
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.010875 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-25dr4\" (UniqueName: \"kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4\") pod \"34f163ba-1cf2-47e5-847f-8db4eac30c29\" (UID: \"34f163ba-1cf2-47e5-847f-8db4eac30c29\") "
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.011798 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle" (OuterVolumeSpecName: "bundle") pod "61a70d11-ba59-4ad2-8427-c28882835ad6" (UID: "61a70d11-ba59-4ad2-8427-c28882835ad6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.011814 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle" (OuterVolumeSpecName: "bundle") pod "34f163ba-1cf2-47e5-847f-8db4eac30c29" (UID: "34f163ba-1cf2-47e5-847f-8db4eac30c29"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.012357 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.012383 4745 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.021130 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4" (OuterVolumeSpecName: "kube-api-access-25dr4") pod "34f163ba-1cf2-47e5-847f-8db4eac30c29" (UID: "34f163ba-1cf2-47e5-847f-8db4eac30c29"). InnerVolumeSpecName "kube-api-access-25dr4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.021189 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g" (OuterVolumeSpecName: "kube-api-access-zqr4g") pod "61a70d11-ba59-4ad2-8427-c28882835ad6" (UID: "61a70d11-ba59-4ad2-8427-c28882835ad6"). InnerVolumeSpecName "kube-api-access-zqr4g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.026489 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util" (OuterVolumeSpecName: "util") pod "34f163ba-1cf2-47e5-847f-8db4eac30c29" (UID: "34f163ba-1cf2-47e5-847f-8db4eac30c29"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.026550 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util" (OuterVolumeSpecName: "util") pod "61a70d11-ba59-4ad2-8427-c28882835ad6" (UID: "61a70d11-ba59-4ad2-8427-c28882835ad6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.113698 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/61a70d11-ba59-4ad2-8427-c28882835ad6-util\") on node \"crc\" DevicePath \"\""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.113733 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqr4g\" (UniqueName: \"kubernetes.io/projected/61a70d11-ba59-4ad2-8427-c28882835ad6-kube-api-access-zqr4g\") on node \"crc\" DevicePath \"\""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.113742 4745 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/34f163ba-1cf2-47e5-847f-8db4eac30c29-util\") on node \"crc\" DevicePath \"\""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.113753 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-25dr4\" (UniqueName: \"kubernetes.io/projected/34f163ba-1cf2-47e5-847f-8db4eac30c29-kube-api-access-25dr4\") on node \"crc\" DevicePath \"\""
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.603839 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24" event={"ID":"61a70d11-ba59-4ad2-8427-c28882835ad6","Type":"ContainerDied","Data":"5e3d52819cccf117ee2102d83ab5a41d16baa92ad9b71c6b5093655e59ea25aa"}
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.604282 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e3d52819cccf117ee2102d83ab5a41d16baa92ad9b71c6b5093655e59ea25aa"
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.603899 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/500c4f010310dad14c569d8fa2124fef1cf701af50ed1128cec4daf65a6tk24"
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.605994 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4" event={"ID":"34f163ba-1cf2-47e5-847f-8db4eac30c29","Type":"ContainerDied","Data":"221db3a012c61b898efbb2032e252e9399d8603ae3c537e79d9ed30671636fc7"}
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.606038 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="221db3a012c61b898efbb2032e252e9399d8603ae3c537e79d9ed30671636fc7"
Mar 19 00:34:29 crc kubenswrapper[4745]: I0319 00:34:29.606124 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/372e7d5daac88c2e9a91443a2f508c8c20ad57bc41b1606ec960d61c09xpwl4"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.046277 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"]
Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047009 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="pull"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047024 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="pull"
Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047033 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="util"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047039 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="util"
Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047058 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="pull"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047064 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="pull"
Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047073 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="extract"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047079 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="extract"
Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047090 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="extract"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047096 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="extract"
Mar 19 00:34:34 crc kubenswrapper[4745]: E0319 00:34:34.047105 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="util"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047110 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="util"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047265 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="61a70d11-ba59-4ad2-8427-c28882835ad6" containerName="extract"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047280 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="34f163ba-1cf2-47e5-847f-8db4eac30c29" containerName="extract"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.047898 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.063640 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"]
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.067409 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-operator-dockercfg-x5fd7"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.189171 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/a8895e98-6a3f-4f8a-b671-9a7920ceb390-runner\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.189814 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68sr2\" (UniqueName: \"kubernetes.io/projected/a8895e98-6a3f-4f8a-b671-9a7920ceb390-kube-api-access-68sr2\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.291247 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/a8895e98-6a3f-4f8a-b671-9a7920ceb390-runner\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.291312 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68sr2\" (UniqueName: \"kubernetes.io/projected/a8895e98-6a3f-4f8a-b671-9a7920ceb390-kube-api-access-68sr2\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.293436 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/a8895e98-6a3f-4f8a-b671-9a7920ceb390-runner\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.324046 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68sr2\" (UniqueName: \"kubernetes.io/projected/a8895e98-6a3f-4f8a-b671-9a7920ceb390-kube-api-access-68sr2\") pod \"smart-gateway-operator-ff5d8cc8d-tcxb4\" (UID: \"a8895e98-6a3f-4f8a-b671-9a7920ceb390\") " pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.367676 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.618237 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4"]
Mar 19 00:34:34 crc kubenswrapper[4745]: I0319 00:34:34.658420 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" event={"ID":"a8895e98-6a3f-4f8a-b671-9a7920ceb390","Type":"ContainerStarted","Data":"fae757808023fb173f247138a7bb6d526986f626e1779762d3339598075a7ca4"}
Mar 19 00:34:35 crc kubenswrapper[4745]: I0319 00:34:35.561782 4745 scope.go:117] "RemoveContainer" containerID="08e036dc6c9a44bd6fdc8a12f3525fb5e0bf5c4fdd30613e6e3e3b5a2939ce17"
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.186068 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"]
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.187690 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.195726 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"service-telemetry-operator-dockercfg-zljj6"
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.226306 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"]
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.347924 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/4754eb2f-bab5-413c-ab43-3b9142082c2f-runner\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.348027 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8gqs\" (UniqueName: \"kubernetes.io/projected/4754eb2f-bab5-413c-ab43-3b9142082c2f-kube-api-access-t8gqs\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.449110 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8gqs\" (UniqueName: \"kubernetes.io/projected/4754eb2f-bab5-413c-ab43-3b9142082c2f-kube-api-access-t8gqs\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.449209 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/4754eb2f-bab5-413c-ab43-3b9142082c2f-runner\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.449721 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"runner\" (UniqueName: \"kubernetes.io/empty-dir/4754eb2f-bab5-413c-ab43-3b9142082c2f-runner\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.481682 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8gqs\" (UniqueName: \"kubernetes.io/projected/4754eb2f-bab5-413c-ab43-3b9142082c2f-kube-api-access-t8gqs\") pod \"service-telemetry-operator-c87c48cb6-d4c8j\" (UID: \"4754eb2f-bab5-413c-ab43-3b9142082c2f\") " pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.531544 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"
Mar 19 00:34:37 crc kubenswrapper[4745]: I0319 00:34:37.855485 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j"]
Mar 19 00:34:38 crc kubenswrapper[4745]: I0319 00:34:38.759342 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" event={"ID":"4754eb2f-bab5-413c-ab43-3b9142082c2f","Type":"ContainerStarted","Data":"153b914866c339974932d1e6651395a6fad755df186120bfd49a7ff0cb7ff251"}
Mar 19 00:34:43 crc kubenswrapper[4745]: I0319 00:34:43.138372 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:34:43 crc kubenswrapper[4745]: E0319 00:34:43.139566 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:34:52 crc kubenswrapper[4745]: E0319 00:34:52.180772 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/infrawatch/smart-gateway-operator:stable-1.5"
Mar 19 00:34:52 crc kubenswrapper[4745]: E0319 00:34:52.181692 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/infrawatch/smart-gateway-operator:stable-1.5,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:WATCH_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.annotations['olm.targetNamespaces'],},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:OPERATOR_NAME,Value:smart-gateway-operator,ValueFrom:nil,},EnvVar{Name:ANSIBLE_GATHERING,Value:explicit,ValueFrom:nil,},EnvVar{Name:ANSIBLE_VERBOSITY_SMARTGATEWAY_SMARTGATEWAY_INFRA_WATCH,Value:4,ValueFrom:nil,},EnvVar{Name:ANSIBLE_DEBUG_LOGS,Value:true,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_CORE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-core:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_BRIDGE_SMARTGATEWAY_IMAGE,Value:image-registry.openshift-image-registry.svc:5000/service-telemetry/sg-bridge:latest,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OAUTH_PROXY_IMAGE,Value:quay.io/openshift/origin-oauth-proxy:latest,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:smart-gateway-operator.v5.0.1773880325,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:runner,ReadOnly:false,MountPath:/tmp/ansible-operator/runner,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68sr2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000670000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod smart-gateway-operator-ff5d8cc8d-tcxb4_service-telemetry(a8895e98-6a3f-4f8a-b671-9a7920ceb390): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 19 00:34:52 crc kubenswrapper[4745]: E0319 00:34:52.183273 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" podUID="a8895e98-6a3f-4f8a-b671-9a7920ceb390"
Mar 19 00:34:52 crc kubenswrapper[4745]: E0319 00:34:52.885005 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/infrawatch/smart-gateway-operator:stable-1.5\\\"\"" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" podUID="a8895e98-6a3f-4f8a-b671-9a7920ceb390"
Mar 19 00:34:55 crc kubenswrapper[4745]: I0319 00:34:55.138397 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:34:55 crc kubenswrapper[4745]: E0319 00:34:55.139092 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:34:57 crc kubenswrapper[4745]: I0319 00:34:57.934914 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" event={"ID":"4754eb2f-bab5-413c-ab43-3b9142082c2f","Type":"ContainerStarted","Data":"fd141a4b6978aac22d9aaa325a3052b0e666649d82df326a720549e63a18ef11"}
Mar 19 00:34:57 crc kubenswrapper[4745]: I0319 00:34:57.956144 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/service-telemetry-operator-c87c48cb6-d4c8j" podStartSLOduration=2.041218155 podStartE2EDuration="20.956120722s" podCreationTimestamp="2026-03-19 00:34:37 +0000 UTC" firstStartedPulling="2026-03-19 00:34:37.886828181 +0000 UTC m=+1642.425023312" lastFinishedPulling="2026-03-19 00:34:56.801730748 +0000 UTC m=+1661.339925879" observedRunningTime="2026-03-19 00:34:57.954212752 +0000 UTC m=+1662.492407883" watchObservedRunningTime="2026-03-19 00:34:57.956120722 +0000 UTC m=+1662.494315843"
Mar 19 00:35:07 crc kubenswrapper[4745]: I0319 00:35:07.137916 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:35:07 crc kubenswrapper[4745]: E0319 00:35:07.138763 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:35:09 crc kubenswrapper[4745]: I0319 00:35:09.011413 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" event={"ID":"a8895e98-6a3f-4f8a-b671-9a7920ceb390","Type":"ContainerStarted","Data":"84bedd6b6132ec1bae853ff3bc2524b11e0cf5fe5139bbbdb11993796612bce7"}
Mar 19 00:35:09 crc kubenswrapper[4745]: I0319 00:35:09.035744 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/smart-gateway-operator-ff5d8cc8d-tcxb4" podStartSLOduration=1.063646116 podStartE2EDuration="35.035706428s" podCreationTimestamp="2026-03-19 00:34:34 +0000 UTC" firstStartedPulling="2026-03-19 00:34:34.627257528 +0000 UTC m=+1639.165452659" lastFinishedPulling="2026-03-19 00:35:08.59931784 +0000 UTC m=+1673.137512971" observedRunningTime="2026-03-19 00:35:09.031873797 +0000 UTC m=+1673.570068938" watchObservedRunningTime="2026-03-19 00:35:09.035706428 +0000 UTC m=+1673.573901559"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.331653 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"]
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.333209 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.336435 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-users"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.336562 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-ca"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.336782 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-credentials"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.337073 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-openstack-ca"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.337282 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-inter-router-credentials"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.339193 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-dockercfg-95txw"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.339539 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-interconnect-sasl-config"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.357836 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"]
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374007 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374333 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374433 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29b62\" (UniqueName: \"kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374504 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374589 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374691 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.374765 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.477502 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.477652 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.477765 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.477830 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.477952 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.478041 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.478063 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29b62\" (UniqueName: \"kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.480342 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.487599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.487763 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.488157 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.488162 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.488266 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.498488 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29b62\" (UniqueName: \"kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62\") pod \"default-interconnect-68864d46cb-2x4w6\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") " pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:35:21 crc kubenswrapper[4745]: I0319 00:35:21.650078 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:35:23 crc kubenswrapper[4745]: I0319 00:35:22.137797 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:35:23 crc kubenswrapper[4745]: E0319 00:35:22.138468 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:35:23 crc kubenswrapper[4745]: I0319 00:35:23.866861 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"] Mar 19 00:35:24 crc kubenswrapper[4745]: I0319 00:35:24.148351 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" event={"ID":"3c93cb13-815f-48ec-a316-a889a6717f7c","Type":"ContainerStarted","Data":"dcfd75548cf8b7e9217b0b25a0988010bbd42dcd7ccfe6fcc357469ebb9d89b8"} Mar 19 00:35:29 crc kubenswrapper[4745]: I0319 00:35:29.192586 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" event={"ID":"3c93cb13-815f-48ec-a316-a889a6717f7c","Type":"ContainerStarted","Data":"6b68238373df0047ebeb6c52001a21e2ba8abdae07e988851a91457251aa4b36"} Mar 19 00:35:29 crc kubenswrapper[4745]: I0319 00:35:29.213281 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" podStartSLOduration=3.535487158 podStartE2EDuration="8.213251386s" podCreationTimestamp="2026-03-19 00:35:21 +0000 UTC" firstStartedPulling="2026-03-19 00:35:23.876833018 +0000 UTC 
m=+1688.415028189" lastFinishedPulling="2026-03-19 00:35:28.554597286 +0000 UTC m=+1693.092792417" observedRunningTime="2026-03-19 00:35:29.209849489 +0000 UTC m=+1693.748044630" watchObservedRunningTime="2026-03-19 00:35:29.213251386 +0000 UTC m=+1693.751446517" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.747539 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.749536 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.751753 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-2" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.752139 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-stf-dockercfg-57c55" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.752408 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.752489 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-session-secret" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.752412 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-prometheus-proxy-tls" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.753617 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.753785 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-tls-assets-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.753981 4745 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"service-telemetry"/"serving-certs-ca-bundle" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.754295 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"prometheus-default-web-config" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.759497 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"prometheus-default-rulefiles-1" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.769486 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961080 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-tls-assets\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961164 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961193 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961217 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961239 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961266 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-web-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961333 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961353 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961471 4745 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-958zv\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-kube-api-access-958zv\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961534 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961658 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7bba4496-9224-467b-80ca-ff25c39604ec-config-out\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:32 crc kubenswrapper[4745]: I0319 00:35:32.961708 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063482 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-web-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 
00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063582 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063622 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063676 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-958zv\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-kube-api-access-958zv\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063711 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063761 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7bba4496-9224-467b-80ca-ff25c39604ec-config-out\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " 
pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063815 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063871 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-tls-assets\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.063932 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.064169 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.064196 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: 
\"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.064224 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: E0319 00:35:33.064738 4745 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 19 00:35:33 crc kubenswrapper[4745]: E0319 00:35:33.064852 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls podName:7bba4496-9224-467b-80ca-ff25c39604ec nodeName:}" failed. No retries permitted until 2026-03-19 00:35:33.564822974 +0000 UTC m=+1698.103018105 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "7bba4496-9224-467b-80ca-ff25c39604ec") : secret "default-prometheus-proxy-tls" not found Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.065472 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-1\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.065504 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-2\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.065473 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-default-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-prometheus-default-rulefiles-0\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.066218 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7bba4496-9224-467b-80ca-ff25c39604ec-configmap-serving-certs-ca-bundle\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 
00:35:33.071020 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.071508 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-web-config\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.072502 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-tls-assets\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.073277 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7bba4496-9224-467b-80ca-ff25c39604ec-config-out\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.073715 4745 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.073758 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9e72b50503c51e0d02fdb78c6acc0817ced455338efa3523ed94a9d283c40373/globalmount\"" pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.083254 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-session-secret\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.085666 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-958zv\" (UniqueName: \"kubernetes.io/projected/7bba4496-9224-467b-80ca-ff25c39604ec-kube-api-access-958zv\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.098779 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-30734426-5e46-4a5f-89a8-e69b85bfdd3d\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: I0319 00:35:33.572761 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:33 crc kubenswrapper[4745]: E0319 00:35:33.573004 4745 secret.go:188] Couldn't get secret service-telemetry/default-prometheus-proxy-tls: secret "default-prometheus-proxy-tls" not found Mar 19 00:35:33 crc kubenswrapper[4745]: E0319 00:35:33.573069 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls podName:7bba4496-9224-467b-80ca-ff25c39604ec nodeName:}" failed. No retries permitted until 2026-03-19 00:35:34.573049659 +0000 UTC m=+1699.111244790 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-prometheus-proxy-tls" (UniqueName: "kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls") pod "prometheus-default-0" (UID: "7bba4496-9224-467b-80ca-ff25c39604ec") : secret "default-prometheus-proxy-tls" not found Mar 19 00:35:34 crc kubenswrapper[4745]: I0319 00:35:34.589877 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:34 crc kubenswrapper[4745]: I0319 00:35:34.617504 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-prometheus-proxy-tls\" (UniqueName: \"kubernetes.io/secret/7bba4496-9224-467b-80ca-ff25c39604ec-secret-default-prometheus-proxy-tls\") pod \"prometheus-default-0\" (UID: \"7bba4496-9224-467b-80ca-ff25c39604ec\") " pod="service-telemetry/prometheus-default-0" Mar 19 00:35:34 crc 
kubenswrapper[4745]: I0319 00:35:34.870263 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/prometheus-default-0" Mar 19 00:35:35 crc kubenswrapper[4745]: I0319 00:35:35.361163 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/prometheus-default-0"] Mar 19 00:35:36 crc kubenswrapper[4745]: I0319 00:35:36.265705 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerStarted","Data":"cf8cf1057b9dce4ad67641c5f9d14b2568c1ed26325186a786b78caa6d428710"} Mar 19 00:35:37 crc kubenswrapper[4745]: I0319 00:35:37.137232 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:35:37 crc kubenswrapper[4745]: E0319 00:35:37.137940 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:35:40 crc kubenswrapper[4745]: I0319 00:35:40.300967 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerStarted","Data":"c82a19278aef976ca00a2ceac0c95fb7154d152b86cb049420592bb81d699026"} Mar 19 00:35:43 crc kubenswrapper[4745]: I0319 00:35:43.791896 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4pxj4"] Mar 19 00:35:43 crc kubenswrapper[4745]: I0319 00:35:43.793592 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" Mar 19 00:35:43 crc kubenswrapper[4745]: I0319 00:35:43.806186 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4pxj4"] Mar 19 00:35:43 crc kubenswrapper[4745]: I0319 00:35:43.935739 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ssjs\" (UniqueName: \"kubernetes.io/projected/7ed8bcc5-1389-4cff-b64c-bf3b813f642e-kube-api-access-8ssjs\") pod \"default-snmp-webhook-6856cfb745-4pxj4\" (UID: \"7ed8bcc5-1389-4cff-b64c-bf3b813f642e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" Mar 19 00:35:44 crc kubenswrapper[4745]: I0319 00:35:44.037056 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ssjs\" (UniqueName: \"kubernetes.io/projected/7ed8bcc5-1389-4cff-b64c-bf3b813f642e-kube-api-access-8ssjs\") pod \"default-snmp-webhook-6856cfb745-4pxj4\" (UID: \"7ed8bcc5-1389-4cff-b64c-bf3b813f642e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" Mar 19 00:35:44 crc kubenswrapper[4745]: I0319 00:35:44.058174 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ssjs\" (UniqueName: \"kubernetes.io/projected/7ed8bcc5-1389-4cff-b64c-bf3b813f642e-kube-api-access-8ssjs\") pod \"default-snmp-webhook-6856cfb745-4pxj4\" (UID: \"7ed8bcc5-1389-4cff-b64c-bf3b813f642e\") " pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" Mar 19 00:35:44 crc kubenswrapper[4745]: I0319 00:35:44.114167 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4"
Mar 19 00:35:44 crc kubenswrapper[4745]: I0319 00:35:44.344281 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-snmp-webhook-6856cfb745-4pxj4"]
Mar 19 00:35:45 crc kubenswrapper[4745]: I0319 00:35:45.348431 4745 generic.go:334] "Generic (PLEG): container finished" podID="7bba4496-9224-467b-80ca-ff25c39604ec" containerID="c82a19278aef976ca00a2ceac0c95fb7154d152b86cb049420592bb81d699026" exitCode=0
Mar 19 00:35:45 crc kubenswrapper[4745]: I0319 00:35:45.348601 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerDied","Data":"c82a19278aef976ca00a2ceac0c95fb7154d152b86cb049420592bb81d699026"}
Mar 19 00:35:45 crc kubenswrapper[4745]: I0319 00:35:45.351679 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" event={"ID":"7ed8bcc5-1389-4cff-b64c-bf3b813f642e","Type":"ContainerStarted","Data":"e78addcb3241e534311ac9cce6865e47d325db27204bf2125812d6ca0b948193"}
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.230575 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.235277 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.239717 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-stf-dockercfg-2vrkh"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.240269 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-tls-assets-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.240519 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-alertmanager-proxy-tls"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.240685 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-generated"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.240835 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-cluster-tls-config"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.241770 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"alertmanager-default-web-config"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.245559 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399671 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399723 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399766 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-volume\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399823 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-web-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399848 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-739bd674-a203-4a24-850e-e18ec05bafe0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-739bd674-a203-4a24-850e-e18ec05bafe0\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.399866 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.400016 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-tls-assets\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.400059 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-out\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.400100 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smmkd\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-kube-api-access-smmkd\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.501859 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.501967 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.501998 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-volume\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: E0319 00:35:47.502086 4745 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 19 00:35:47 crc kubenswrapper[4745]: E0319 00:35:47.503067 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls podName:0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7 nodeName:}" failed. No retries permitted until 2026-03-19 00:35:48.003030383 +0000 UTC m=+1712.541225514 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7") : secret "default-alertmanager-proxy-tls" not found
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503147 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-739bd674-a203-4a24-850e-e18ec05bafe0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-739bd674-a203-4a24-850e-e18ec05bafe0\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503247 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-web-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503342 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503402 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-tls-assets\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503525 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-out\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.503922 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smmkd\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-kube-api-access-smmkd\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.512408 4745 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.512591 4745 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-739bd674-a203-4a24-850e-e18ec05bafe0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-739bd674-a203-4a24-850e-e18ec05bafe0\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f44b85b436fe394314838612ea3934316cfd49736989031f5eaeb5e07b616f52/globalmount\"" pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.513459 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-tls-assets\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.513137 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-out\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.514004 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-session-secret\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-session-secret\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.514399 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-web-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.534170 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smmkd\" (UniqueName: \"kubernetes.io/projected/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-kube-api-access-smmkd\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.539208 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-config-volume\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.562638 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-cluster-tls-config\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:47 crc kubenswrapper[4745]: I0319 00:35:47.567938 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-739bd674-a203-4a24-850e-e18ec05bafe0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-739bd674-a203-4a24-850e-e18ec05bafe0\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:48 crc kubenswrapper[4745]: I0319 00:35:48.017558 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:48 crc kubenswrapper[4745]: E0319 00:35:48.017772 4745 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 19 00:35:48 crc kubenswrapper[4745]: E0319 00:35:48.017833 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls podName:0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7 nodeName:}" failed. No retries permitted until 2026-03-19 00:35:49.017817293 +0000 UTC m=+1713.556012424 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7") : secret "default-alertmanager-proxy-tls" not found
Mar 19 00:35:48 crc kubenswrapper[4745]: I0319 00:35:48.138728 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:35:48 crc kubenswrapper[4745]: E0319 00:35:48.139176 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:35:49 crc kubenswrapper[4745]: I0319 00:35:49.032206 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:49 crc kubenswrapper[4745]: E0319 00:35:49.032420 4745 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 19 00:35:49 crc kubenswrapper[4745]: E0319 00:35:49.032749 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls podName:0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7 nodeName:}" failed. No retries permitted until 2026-03-19 00:35:51.032724424 +0000 UTC m=+1715.570919555 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7") : secret "default-alertmanager-proxy-tls" not found
Mar 19 00:35:51 crc kubenswrapper[4745]: I0319 00:35:51.067808 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:51 crc kubenswrapper[4745]: E0319 00:35:51.068097 4745 secret.go:188] Couldn't get secret service-telemetry/default-alertmanager-proxy-tls: secret "default-alertmanager-proxy-tls" not found
Mar 19 00:35:51 crc kubenswrapper[4745]: E0319 00:35:51.068390 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls podName:0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7 nodeName:}" failed. No retries permitted until 2026-03-19 00:35:55.068366689 +0000 UTC m=+1719.606561820 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "secret-default-alertmanager-proxy-tls" (UniqueName: "kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls") pod "alertmanager-default-0" (UID: "0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7") : secret "default-alertmanager-proxy-tls" not found
Mar 19 00:35:55 crc kubenswrapper[4745]: I0319 00:35:55.135653 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:55 crc kubenswrapper[4745]: I0319 00:35:55.155827 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-default-alertmanager-proxy-tls\" (UniqueName: \"kubernetes.io/secret/0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7-secret-default-alertmanager-proxy-tls\") pod \"alertmanager-default-0\" (UID: \"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7\") " pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:55 crc kubenswrapper[4745]: I0319 00:35:55.370118 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/alertmanager-default-0"
Mar 19 00:35:56 crc kubenswrapper[4745]: I0319 00:35:56.198612 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/alertmanager-default-0"]
Mar 19 00:35:56 crc kubenswrapper[4745]: W0319 00:35:56.202581 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a13ce6d_9eb6_4d56_adfd_7c51a426c4b7.slice/crio-b539c240599a8a5df637c538d065d5f958eb9c22a81fd6d40b2ec11668d1c232 WatchSource:0}: Error finding container b539c240599a8a5df637c538d065d5f958eb9c22a81fd6d40b2ec11668d1c232: Status 404 returned error can't find the container with id b539c240599a8a5df637c538d065d5f958eb9c22a81fd6d40b2ec11668d1c232
Mar 19 00:35:56 crc kubenswrapper[4745]: I0319 00:35:56.452068 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerStarted","Data":"3ca731c08f48fe75ca37730ddc55af586eaf24640cfc01e5251547d8fe2dcf32"}
Mar 19 00:35:56 crc kubenswrapper[4745]: I0319 00:35:56.453788 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" event={"ID":"7ed8bcc5-1389-4cff-b64c-bf3b813f642e","Type":"ContainerStarted","Data":"87f458034926bc2a409c50354a3eaeb72e25a722731956d92cfb42a63e9f3640"}
Mar 19 00:35:56 crc kubenswrapper[4745]: I0319 00:35:56.456319 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerStarted","Data":"b539c240599a8a5df637c538d065d5f958eb9c22a81fd6d40b2ec11668d1c232"}
Mar 19 00:35:59 crc kubenswrapper[4745]: I0319 00:35:59.479542 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerStarted","Data":"0491f36c0bfcc256f95f552285dd5d8bf9df1e78fe12f7e0dc4b452a81746238"}
Mar 19 00:35:59 crc kubenswrapper[4745]: I0319 00:35:59.482212 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerStarted","Data":"11c715d2f9fac89bb8a3a7830b902ae3132a5417eb1f5a0f573327daa0282902"}
Mar 19 00:35:59 crc kubenswrapper[4745]: I0319 00:35:59.509833 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-snmp-webhook-6856cfb745-4pxj4" podStartSLOduration=5.385977137 podStartE2EDuration="16.509788367s" podCreationTimestamp="2026-03-19 00:35:43 +0000 UTC" firstStartedPulling="2026-03-19 00:35:44.354547593 +0000 UTC m=+1708.892742724" lastFinishedPulling="2026-03-19 00:35:55.478358823 +0000 UTC m=+1720.016553954" observedRunningTime="2026-03-19 00:35:56.468727416 +0000 UTC m=+1721.006922547" watchObservedRunningTime="2026-03-19 00:35:59.509788367 +0000 UTC m=+1724.047983508"
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.140060 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:36:00 crc kubenswrapper[4745]: E0319 00:36:00.140836 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.171337 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564676-6jsmm"]
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.172124 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564676-6jsmm"]
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.172225 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564676-6jsmm"
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.175338 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9"
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.175567 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.175735 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.217968 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctsf2\" (UniqueName: \"kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2\") pod \"auto-csr-approver-29564676-6jsmm\" (UID: \"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60\") " pod="openshift-infra/auto-csr-approver-29564676-6jsmm"
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.322302 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ctsf2\" (UniqueName: \"kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2\") pod \"auto-csr-approver-29564676-6jsmm\" (UID: \"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60\") " pod="openshift-infra/auto-csr-approver-29564676-6jsmm"
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.361760 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctsf2\" (UniqueName: \"kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2\") pod \"auto-csr-approver-29564676-6jsmm\" (UID: \"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60\") " pod="openshift-infra/auto-csr-approver-29564676-6jsmm"
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.500192 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564676-6jsmm"
Mar 19 00:36:00 crc kubenswrapper[4745]: I0319 00:36:00.997205 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564676-6jsmm"]
Mar 19 00:36:01 crc kubenswrapper[4745]: I0319 00:36:01.508278 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" event={"ID":"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60","Type":"ContainerStarted","Data":"70fe04024e13a78c92725ba2469acbf2db19f7926717f6ee559cb02f5b6394d3"}
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.848407 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"]
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.854040 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.862138 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-meter-sg-core-configmap"
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.862189 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-coll-meter-proxy-tls"
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.862464 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-dockercfg-7zvhj"
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.863030 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"smart-gateway-session-secret"
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.878711 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"]
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.972568 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71276875-43be-4d09-a25d-4327369c3a53-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.972651 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qqj8\" (UniqueName: \"kubernetes.io/projected/71276875-43be-4d09-a25d-4327369c3a53-kube-api-access-5qqj8\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.972795 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.972841 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:02 crc kubenswrapper[4745]: I0319 00:36:02.972927 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71276875-43be-4d09-a25d-4327369c3a53-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.074468 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71276875-43be-4d09-a25d-4327369c3a53-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.074568 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qqj8\" (UniqueName: \"kubernetes.io/projected/71276875-43be-4d09-a25d-4327369c3a53-kube-api-access-5qqj8\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.074705 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.074747 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.074820 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71276875-43be-4d09-a25d-4327369c3a53-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.075090 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/71276875-43be-4d09-a25d-4327369c3a53-socket-dir\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: E0319 00:36:03.075331 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found
Mar 19 00:36:03 crc kubenswrapper[4745]: E0319 00:36:03.075460 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls podName:71276875-43be-4d09-a25d-4327369c3a53 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:03.575429792 +0000 UTC m=+1728.113625103 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" (UID: "71276875-43be-4d09-a25d-4327369c3a53") : secret "default-cloud1-coll-meter-proxy-tls" not found
Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.076635 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/71276875-43be-4d09-a25d-4327369c3a53-sg-core-config\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.091238 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-session-secret\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.097777 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qqj8\" (UniqueName: \"kubernetes.io/projected/71276875-43be-4d09-a25d-4327369c3a53-kube-api-access-5qqj8\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: I0319 00:36:03.590742 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:03 crc kubenswrapper[4745]: E0319 00:36:03.591024 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-coll-meter-proxy-tls: secret "default-cloud1-coll-meter-proxy-tls" not found
Mar 19 00:36:03 crc kubenswrapper[4745]: E0319 00:36:03.591385 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls podName:71276875-43be-4d09-a25d-4327369c3a53 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:04.591356898 +0000 UTC m=+1729.129552029 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-coll-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls") pod "default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" (UID: "71276875-43be-4d09-a25d-4327369c3a53") : secret "default-cloud1-coll-meter-proxy-tls" not found
Mar 19 00:36:04 crc kubenswrapper[4745]: I0319 00:36:04.531833 4745 generic.go:334] "Generic (PLEG): container finished" podID="0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7" containerID="11c715d2f9fac89bb8a3a7830b902ae3132a5417eb1f5a0f573327daa0282902" exitCode=0
Mar 19 00:36:04 crc kubenswrapper[4745]: I0319 00:36:04.531903 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerDied","Data":"11c715d2f9fac89bb8a3a7830b902ae3132a5417eb1f5a0f573327daa0282902"}
Mar 19 00:36:04 crc kubenswrapper[4745]: I0319 00:36:04.610252 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:04 crc kubenswrapper[4745]: I0319 00:36:04.615755 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-coll-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/71276875-43be-4d09-a25d-4327369c3a53-default-cloud1-coll-meter-proxy-tls\") pod \"default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg\" (UID: \"71276875-43be-4d09-a25d-4327369c3a53\") " pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"
Mar 19 00:36:04 crc kubenswrapper[4745]: I0319 00:36:04.685245 4745 util.go:30] "No sandbox for pod
can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.328276 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6"] Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.336019 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.339391 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-ceil-meter-proxy-tls" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.339609 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-meter-sg-core-configmap" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.343177 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6"] Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.439548 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.439626 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: 
\"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.439653 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzbqf\" (UniqueName: \"kubernetes.io/projected/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-kube-api-access-rzbqf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.439693 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.439713 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.541247 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 
00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.541767 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.541805 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzbqf\" (UniqueName: \"kubernetes.io/projected/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-kube-api-access-rzbqf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.541869 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.541917 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: E0319 00:36:06.542071 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: 
secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 19 00:36:06 crc kubenswrapper[4745]: E0319 00:36:06.542209 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls podName:aa12986d-ffa2-4a08-9069-77fc4fdd80c6 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:07.042162886 +0000 UTC m=+1731.580358187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" (UID: "aa12986d-ffa2-4a08-9069-77fc4fdd80c6") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.542271 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-socket-dir\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.543459 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-sg-core-config\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.558083 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-session-secret\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" 
(UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.565915 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzbqf\" (UniqueName: \"kubernetes.io/projected/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-kube-api-access-rzbqf\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.597442 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/prometheus-default-0" event={"ID":"7bba4496-9224-467b-80ca-ff25c39604ec","Type":"ContainerStarted","Data":"14115b4c7fc81ef3247bc1ed47b86b4e33a0eb9ca5358e50ae275dbb80cdfd00"} Mar 19 00:36:06 crc kubenswrapper[4745]: I0319 00:36:06.629775 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg"] Mar 19 00:36:06 crc kubenswrapper[4745]: W0319 00:36:06.991252 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71276875_43be_4d09_a25d_4327369c3a53.slice/crio-e07c38ba1bac5ae91b146d8b828f53be2bf680707632b11a484bd2648a17362a WatchSource:0}: Error finding container e07c38ba1bac5ae91b146d8b828f53be2bf680707632b11a484bd2648a17362a: Status 404 returned error can't find the container with id e07c38ba1bac5ae91b146d8b828f53be2bf680707632b11a484bd2648a17362a Mar 19 00:36:07 crc kubenswrapper[4745]: I0319 00:36:07.049302 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls\") pod 
\"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:07 crc kubenswrapper[4745]: E0319 00:36:07.049536 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-ceil-meter-proxy-tls: secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 19 00:36:07 crc kubenswrapper[4745]: E0319 00:36:07.049648 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls podName:aa12986d-ffa2-4a08-9069-77fc4fdd80c6 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:08.049619057 +0000 UTC m=+1732.587814188 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-ceil-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls") pod "default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" (UID: "aa12986d-ffa2-4a08-9069-77fc4fdd80c6") : secret "default-cloud1-ceil-meter-proxy-tls" not found Mar 19 00:36:07 crc kubenswrapper[4745]: I0319 00:36:07.607051 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"e07c38ba1bac5ae91b146d8b828f53be2bf680707632b11a484bd2648a17362a"} Mar 19 00:36:07 crc kubenswrapper[4745]: I0319 00:36:07.609392 4745 generic.go:334] "Generic (PLEG): container finished" podID="b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" containerID="4de285e74e641eb21f2a1ee98f32c5f610d3c8d1a0fc10bb8a444c82e684e43e" exitCode=0 Mar 19 00:36:07 crc kubenswrapper[4745]: I0319 00:36:07.609497 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" 
event={"ID":"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60","Type":"ContainerDied","Data":"4de285e74e641eb21f2a1ee98f32c5f610d3c8d1a0fc10bb8a444c82e684e43e"} Mar 19 00:36:07 crc kubenswrapper[4745]: I0319 00:36:07.625448 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/prometheus-default-0" podStartSLOduration=5.775374703 podStartE2EDuration="36.625423395s" podCreationTimestamp="2026-03-19 00:35:31 +0000 UTC" firstStartedPulling="2026-03-19 00:35:35.350558811 +0000 UTC m=+1699.888753942" lastFinishedPulling="2026-03-19 00:36:06.200607503 +0000 UTC m=+1730.738802634" observedRunningTime="2026-03-19 00:36:06.641248304 +0000 UTC m=+1731.179443435" watchObservedRunningTime="2026-03-19 00:36:07.625423395 +0000 UTC m=+1732.163618526" Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.074252 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.081596 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-ceil-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/aa12986d-ffa2-4a08-9069-77fc4fdd80c6-default-cloud1-ceil-meter-proxy-tls\") pod \"default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6\" (UID: \"aa12986d-ffa2-4a08-9069-77fc4fdd80c6\") " pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.163641 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.611815 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6"] Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.621988 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerStarted","Data":"3cd590c42af1c4dd91a1a45086b1a788efa5679fd559921573aa95fc57e49918"} Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.623926 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"5fb4a33900be022415bf042c720b7ab321b0b8282f6497c694b151ca05ad4fd6"} Mar 19 00:36:08 crc kubenswrapper[4745]: W0319 00:36:08.646646 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa12986d_ffa2_4a08_9069_77fc4fdd80c6.slice/crio-eb6c1d5653b9a9ca68527a19608f429de46ae832c7cf61b1326d14c93e9faaa8 WatchSource:0}: Error finding container eb6c1d5653b9a9ca68527a19608f429de46ae832c7cf61b1326d14c93e9faaa8: Status 404 returned error can't find the container with id eb6c1d5653b9a9ca68527a19608f429de46ae832c7cf61b1326d14c93e9faaa8 Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.863281 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.897548 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctsf2\" (UniqueName: \"kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2\") pod \"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60\" (UID: \"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60\") " Mar 19 00:36:08 crc kubenswrapper[4745]: I0319 00:36:08.904591 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2" (OuterVolumeSpecName: "kube-api-access-ctsf2") pod "b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" (UID: "b28c36f9-bd14-4de4-a0b3-e3f5e9131f60"). InnerVolumeSpecName "kube-api-access-ctsf2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:08.999597 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctsf2\" (UniqueName: \"kubernetes.io/projected/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60-kube-api-access-ctsf2\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.632647 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.632652 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564676-6jsmm" event={"ID":"b28c36f9-bd14-4de4-a0b3-e3f5e9131f60","Type":"ContainerDied","Data":"70fe04024e13a78c92725ba2469acbf2db19f7926717f6ee559cb02f5b6394d3"} Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.632819 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70fe04024e13a78c92725ba2469acbf2db19f7926717f6ee559cb02f5b6394d3" Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.633824 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"eb6c1d5653b9a9ca68527a19608f429de46ae832c7cf61b1326d14c93e9faaa8"} Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.870935 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/prometheus-default-0" Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.941039 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564670-8sw74"] Mar 19 00:36:09 crc kubenswrapper[4745]: I0319 00:36:09.947715 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564670-8sw74"] Mar 19 00:36:10 crc kubenswrapper[4745]: I0319 00:36:10.145914 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a566f97-13b9-4fde-868a-f55bd82a1af6" path="/var/lib/kubelet/pods/9a566f97-13b9-4fde-868a-f55bd82a1af6/volumes" Mar 19 00:36:10 crc kubenswrapper[4745]: I0319 00:36:10.645964 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" 
event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"d06419cf324315fc94fd28fa12b688584ca303ebd7cd1fabeb84325363ee24a7"} Mar 19 00:36:10 crc kubenswrapper[4745]: I0319 00:36:10.655833 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"c09f6fd57b4510a6536ce6cede5c5d0507077eb14b1eeafd9132c5c0313f2a67"} Mar 19 00:36:10 crc kubenswrapper[4745]: I0319 00:36:10.655932 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"8ad5b98b5dda413ed20a29983acc0b2407844b57318959de371275cfcb8944ac"} Mar 19 00:36:10 crc kubenswrapper[4745]: I0319 00:36:10.659135 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerStarted","Data":"93d8d19ccb8cad93714b317ffb236f4f51edc43950a55d0e225f5ed61d3f97a7"} Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.274758 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"] Mar 19 00:36:11 crc kubenswrapper[4745]: E0319 00:36:11.275211 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" containerName="oc" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.275238 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" containerName="oc" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.275406 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" containerName="oc" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.276741 4745 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.280312 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-sens-meter-sg-core-configmap" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.280611 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-cloud1-sens-meter-proxy-tls" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.288688 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"] Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.337170 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/651a6724-09fd-4395-859f-7fdff0781163-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.337264 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czp9r\" (UniqueName: \"kubernetes.io/projected/651a6724-09fd-4395-859f-7fdff0781163-kube-api-access-czp9r\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.337317 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-session-secret\") pod 
\"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.337376 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.337408 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/651a6724-09fd-4395-859f-7fdff0781163-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.438920 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czp9r\" (UniqueName: \"kubernetes.io/projected/651a6724-09fd-4395-859f-7fdff0781163-kube-api-access-czp9r\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.439009 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " 
pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.439055 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.439080 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/651a6724-09fd-4395-859f-7fdff0781163-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.439124 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/651a6724-09fd-4395-859f-7fdff0781163-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:11 crc kubenswrapper[4745]: E0319 00:36:11.439336 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 19 00:36:11 crc kubenswrapper[4745]: E0319 00:36:11.439464 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls podName:651a6724-09fd-4395-859f-7fdff0781163 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:11.939432159 +0000 UTC m=+1736.477627290 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" (UID: "651a6724-09fd-4395-859f-7fdff0781163") : secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.439976 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/651a6724-09fd-4395-859f-7fdff0781163-socket-dir\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.440302 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/651a6724-09fd-4395-859f-7fdff0781163-sg-core-config\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.462670 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czp9r\" (UniqueName: \"kubernetes.io/projected/651a6724-09fd-4395-859f-7fdff0781163-kube-api-access-czp9r\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.463280 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"session-secret\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-session-secret\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.674756 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/alertmanager-default-0" event={"ID":"0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7","Type":"ContainerStarted","Data":"76c77b4195939436d825229401dfff783f924882281e352d295e60a3c2c38635"}
Mar 19 00:36:11 crc kubenswrapper[4745]: I0319 00:36:11.949669 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:11 crc kubenswrapper[4745]: E0319 00:36:11.951471 4745 secret.go:188] Couldn't get secret service-telemetry/default-cloud1-sens-meter-proxy-tls: secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 19 00:36:11 crc kubenswrapper[4745]: E0319 00:36:11.951567 4745 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls podName:651a6724-09fd-4395-859f-7fdff0781163 nodeName:}" failed. No retries permitted until 2026-03-19 00:36:12.951520084 +0000 UTC m=+1737.489715215 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "default-cloud1-sens-meter-proxy-tls" (UniqueName: "kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls") pod "default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" (UID: "651a6724-09fd-4395-859f-7fdff0781163") : secret "default-cloud1-sens-meter-proxy-tls" not found
Mar 19 00:36:12 crc kubenswrapper[4745]: I0319 00:36:12.137955 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:36:12 crc kubenswrapper[4745]: E0319 00:36:12.138231 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:36:12 crc kubenswrapper[4745]: I0319 00:36:12.969707 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:12 crc kubenswrapper[4745]: I0319 00:36:12.977018 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-cloud1-sens-meter-proxy-tls\" (UniqueName: \"kubernetes.io/secret/651a6724-09fd-4395-859f-7fdff0781163-default-cloud1-sens-meter-proxy-tls\") pod \"default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl\" (UID: \"651a6724-09fd-4395-859f-7fdff0781163\") " pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:13 crc kubenswrapper[4745]: I0319 00:36:13.120357 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"
Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.174427 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/alertmanager-default-0" podStartSLOduration=22.614174917 podStartE2EDuration="29.17440589s" podCreationTimestamp="2026-03-19 00:35:46 +0000 UTC" firstStartedPulling="2026-03-19 00:36:04.534785354 +0000 UTC m=+1729.072980485" lastFinishedPulling="2026-03-19 00:36:11.095016327 +0000 UTC m=+1735.633211458" observedRunningTime="2026-03-19 00:36:11.707793082 +0000 UTC m=+1736.245988243" watchObservedRunningTime="2026-03-19 00:36:15.17440589 +0000 UTC m=+1739.712601021"
Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.180436 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl"]
Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.718419 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"5c3373afb0e47ceff526faab0bd50eaca0309fc30b318776cfe25d7bfa420547"}
Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.720461 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"e855459e41a66e7cd0597ff7c75b2951087cdc755d5e4b354ce12c2fde8728e6"}
Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.722331 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"00d724fb7267f4bd8b88cf89d7e346caef43a54f15860ea5c23aef5b049931a8"}
Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.768496 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" podStartSLOduration=6.087772744 podStartE2EDuration="13.76846798s" podCreationTimestamp="2026-03-19 00:36:02 +0000 UTC" firstStartedPulling="2026-03-19 00:36:06.99669713 +0000 UTC m=+1731.534892261" lastFinishedPulling="2026-03-19 00:36:14.677392366 +0000 UTC m=+1739.215587497" observedRunningTime="2026-03-19 00:36:15.744740298 +0000 UTC m=+1740.282935419" watchObservedRunningTime="2026-03-19 00:36:15.76846798 +0000 UTC m=+1740.306663101"
Mar 19 00:36:15 crc kubenswrapper[4745]: I0319 00:36:15.769673 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" podStartSLOduration=3.6944758540000002 podStartE2EDuration="9.769663177s" podCreationTimestamp="2026-03-19 00:36:06 +0000 UTC" firstStartedPulling="2026-03-19 00:36:08.650108692 +0000 UTC m=+1733.188303823" lastFinishedPulling="2026-03-19 00:36:14.725296015 +0000 UTC m=+1739.263491146" observedRunningTime="2026-03-19 00:36:15.763958609 +0000 UTC m=+1740.302153760" watchObservedRunningTime="2026-03-19 00:36:15.769663177 +0000 UTC m=+1740.307858308"
Mar 19 00:36:16 crc kubenswrapper[4745]: I0319 00:36:16.734323 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"b715fd563da2fd37e54c08026dd1465560a32dc988ef695b1f663c83ecd1a0ba"}
Mar 19 00:36:16 crc kubenswrapper[4745]: I0319 00:36:16.734814 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"76c6559d5164fbe901447e34e84a6f2a925b8533c5eaa0bf05828b32a7fd485b"}
Mar 19 00:36:16 crc kubenswrapper[4745]: I0319 00:36:16.734834 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"69def65840cec58199c78c09533c14fbf3ae19b610fc97a7d7bfa870ad7cc13f"}
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.409595 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" podStartSLOduration=7.418241052 podStartE2EDuration="8.409567726s" podCreationTimestamp="2026-03-19 00:36:11 +0000 UTC" firstStartedPulling="2026-03-19 00:36:15.191226156 +0000 UTC m=+1739.729421287" lastFinishedPulling="2026-03-19 00:36:16.18255284 +0000 UTC m=+1740.720747961" observedRunningTime="2026-03-19 00:36:16.762256131 +0000 UTC m=+1741.300451262" watchObservedRunningTime="2026-03-19 00:36:19.409567726 +0000 UTC m=+1743.947762878"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.413345 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"]
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.415016 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.418185 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-cert"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.418577 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-coll-event-sg-core-configmap"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.423231 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"]
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.574911 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d506c7b-246f-4aca-b3ac-635dbc53b579-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.575375 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/1d506c7b-246f-4aca-b3ac-635dbc53b579-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.575628 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d506c7b-246f-4aca-b3ac-635dbc53b579-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.575702 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlkvs\" (UniqueName: \"kubernetes.io/projected/1d506c7b-246f-4aca-b3ac-635dbc53b579-kube-api-access-xlkvs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.676866 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d506c7b-246f-4aca-b3ac-635dbc53b579-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.677423 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/1d506c7b-246f-4aca-b3ac-635dbc53b579-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.677492 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d506c7b-246f-4aca-b3ac-635dbc53b579-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.677525 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlkvs\" (UniqueName: \"kubernetes.io/projected/1d506c7b-246f-4aca-b3ac-635dbc53b579-kube-api-access-xlkvs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.677531 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/1d506c7b-246f-4aca-b3ac-635dbc53b579-socket-dir\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.678599 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/1d506c7b-246f-4aca-b3ac-635dbc53b579-sg-core-config\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.686560 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/1d506c7b-246f-4aca-b3ac-635dbc53b579-elastic-certs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.700641 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlkvs\" (UniqueName: \"kubernetes.io/projected/1d506c7b-246f-4aca-b3ac-635dbc53b579-kube-api-access-xlkvs\") pod \"default-cloud1-coll-event-smartgateway-7f44857896-xgs4c\" (UID: \"1d506c7b-246f-4aca-b3ac-635dbc53b579\") " pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.735398 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.871311 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/prometheus-default-0"
Mar 19 00:36:19 crc kubenswrapper[4745]: I0319 00:36:19.927177 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/prometheus-default-0"
Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.250762 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c"]
Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.766324 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerStarted","Data":"d1da666fbdebf65ceda1716f591828f9d729c6cab2fd5294e467d4cf6d41ddd9"}
Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.812211 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/prometheus-default-0"
Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.972622 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"]
Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.975151 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.984006 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"]
Mar 19 00:36:20 crc kubenswrapper[4745]: I0319 00:36:20.988347 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"default-cloud1-ceil-event-sg-core-configmap"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.114675 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f88023b4-4c23-4946-a12b-3f0cdab93771-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.114764 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbvmf\" (UniqueName: \"kubernetes.io/projected/f88023b4-4c23-4946-a12b-3f0cdab93771-kube-api-access-jbvmf\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.114798 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f88023b4-4c23-4946-a12b-3f0cdab93771-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.114844 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f88023b4-4c23-4946-a12b-3f0cdab93771-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.216824 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f88023b4-4c23-4946-a12b-3f0cdab93771-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.216942 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbvmf\" (UniqueName: \"kubernetes.io/projected/f88023b4-4c23-4946-a12b-3f0cdab93771-kube-api-access-jbvmf\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.216985 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f88023b4-4c23-4946-a12b-3f0cdab93771-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.217029 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f88023b4-4c23-4946-a12b-3f0cdab93771-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.218790 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/f88023b4-4c23-4946-a12b-3f0cdab93771-socket-dir\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.218988 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-config\" (UniqueName: \"kubernetes.io/configmap/f88023b4-4c23-4946-a12b-3f0cdab93771-sg-core-config\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.237720 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-certs\" (UniqueName: \"kubernetes.io/secret/f88023b4-4c23-4946-a12b-3f0cdab93771-elastic-certs\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.241554 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbvmf\" (UniqueName: \"kubernetes.io/projected/f88023b4-4c23-4946-a12b-3f0cdab93771-kube-api-access-jbvmf\") pod \"default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw\" (UID: \"f88023b4-4c23-4946-a12b-3f0cdab93771\") " pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:21 crc kubenswrapper[4745]: I0319 00:36:21.322703 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"
Mar 19 00:36:26 crc kubenswrapper[4745]: I0319 00:36:26.149133 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:36:26 crc kubenswrapper[4745]: E0319 00:36:26.150288 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:36:29 crc kubenswrapper[4745]: I0319 00:36:29.100296 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw"]
Mar 19 00:36:29 crc kubenswrapper[4745]: I0319 00:36:29.164800 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerStarted","Data":"cd1becfe8def766c68f51f1f70e281ff8b63a047134e521aeccb27fcb24da6af"}
Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.174943 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerStarted","Data":"5db347cbdc2cb16a52036d40008b6863e1e50d39a28cedc5cbcc6eaf56324ba9"}
Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.175786 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerStarted","Data":"cfb887723d4d194f268766af8ee4a172a3ee0c93bac5baa127b409ccb719c01a"}
Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.177710 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerStarted","Data":"2af68900592a904c84b38f8c758a12ed12be59dab65307542ab1213e16f32357"}
Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.177761 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerStarted","Data":"46eca11823ff9b357cffb37437e48eab6984b261c9db2831eaf601f907a099e2"}
Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.202000 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" podStartSLOduration=1.7290795829999999 podStartE2EDuration="11.201971569s" podCreationTimestamp="2026-03-19 00:36:19 +0000 UTC" firstStartedPulling="2026-03-19 00:36:20.247247804 +0000 UTC m=+1744.785442935" lastFinishedPulling="2026-03-19 00:36:29.72013979 +0000 UTC m=+1754.258334921" observedRunningTime="2026-03-19 00:36:30.197712655 +0000 UTC m=+1754.735907796" watchObservedRunningTime="2026-03-19 00:36:30.201971569 +0000 UTC m=+1754.740166720"
Mar 19 00:36:30 crc kubenswrapper[4745]: I0319 00:36:30.217707 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" podStartSLOduration=9.396766983 podStartE2EDuration="10.21768191s" podCreationTimestamp="2026-03-19 00:36:20 +0000 UTC" firstStartedPulling="2026-03-19 00:36:29.106242906 +0000 UTC m=+1753.644438037" lastFinishedPulling="2026-03-19 00:36:29.927157833 +0000 UTC m=+1754.465352964" observedRunningTime="2026-03-19 00:36:30.214757799 +0000 UTC m=+1754.752952930" watchObservedRunningTime="2026-03-19 00:36:30.21768191 +0000 UTC m=+1754.755877041"
Mar 19 00:36:35 crc kubenswrapper[4745]: I0319 00:36:35.693509 4745 scope.go:117] "RemoveContainer" containerID="e1e89ee6fc2c85074b56c8f19c7bf183b3c352108812ec9dcafde77f229e8ca5"
Mar 19 00:36:35 crc kubenswrapper[4745]: I0319 00:36:35.742106 4745 scope.go:117] "RemoveContainer" containerID="ce8a63b6903edaf1dfec62801e62f011b41ac609121cec118da8bcbd296b697b"
Mar 19 00:36:35 crc kubenswrapper[4745]: I0319 00:36:35.776201 4745 scope.go:117] "RemoveContainer" containerID="649af08f705b85b72e2b308ff29da2e7d5edce2a304ca0c563098fa2b731a46b"
Mar 19 00:36:37 crc kubenswrapper[4745]: I0319 00:36:37.137892 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:36:37 crc kubenswrapper[4745]: E0319 00:36:37.138664 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:36:37 crc kubenswrapper[4745]: I0319 00:36:37.777869 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"]
Mar 19 00:36:37 crc kubenswrapper[4745]: I0319 00:36:37.778242 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" podUID="3c93cb13-815f-48ec-a316-a889a6717f7c" containerName="default-interconnect" containerID="cri-o://6b68238373df0047ebeb6c52001a21e2ba8abdae07e988851a91457251aa4b36" gracePeriod=30
Mar 19 00:36:38 crc kubenswrapper[4745]: I0319 00:36:38.665211 4745 generic.go:334] "Generic (PLEG): container finished" podID="3c93cb13-815f-48ec-a316-a889a6717f7c" containerID="6b68238373df0047ebeb6c52001a21e2ba8abdae07e988851a91457251aa4b36" exitCode=0
Mar 19 00:36:38 crc kubenswrapper[4745]: I0319 00:36:38.665716 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" event={"ID":"3c93cb13-815f-48ec-a316-a889a6717f7c","Type":"ContainerDied","Data":"6b68238373df0047ebeb6c52001a21e2ba8abdae07e988851a91457251aa4b36"}
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.009867 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6"
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.049068 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-wgvm9"]
Mar 19 00:36:39 crc kubenswrapper[4745]: E0319 00:36:39.049403 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c93cb13-815f-48ec-a316-a889a6717f7c" containerName="default-interconnect"
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.049421 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c93cb13-815f-48ec-a316-a889a6717f7c" containerName="default-interconnect"
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.049539 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c93cb13-815f-48ec-a316-a889a6717f7c" containerName="default-interconnect"
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.050100 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-wgvm9"
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.087097 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-wgvm9"]
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130312 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") "
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130530 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") "
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130585 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") "
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130618 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") "
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130666 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") "
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130727 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29b62\" (UniqueName: \"kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") "
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.130779 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials\") pod \"3c93cb13-815f-48ec-a316-a889a6717f7c\" (UID: \"3c93cb13-815f-48ec-a316-a889a6717f7c\") "
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.131681 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config" (OuterVolumeSpecName: "sasl-config") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "sasl-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.137188 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users" (OuterVolumeSpecName: "sasl-users") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "sasl-users". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.137548 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca" (OuterVolumeSpecName: "default-interconnect-openstack-ca") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "default-interconnect-openstack-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.138274 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62" (OuterVolumeSpecName: "kube-api-access-29b62") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "kube-api-access-29b62". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.138729 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials" (OuterVolumeSpecName: "default-interconnect-openstack-credentials") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "default-interconnect-openstack-credentials". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.139184 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca" (OuterVolumeSpecName: "default-interconnect-inter-router-ca") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "default-interconnect-inter-router-ca".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.140928 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials" (OuterVolumeSpecName: "default-interconnect-inter-router-credentials") pod "3c93cb13-815f-48ec-a316-a889a6717f7c" (UID: "3c93cb13-815f-48ec-a316-a889a6717f7c"). InnerVolumeSpecName "default-interconnect-inter-router-credentials". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.232383 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.232861 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.232905 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " 
pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233051 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bkh9\" (UniqueName: \"kubernetes.io/projected/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-kube-api-access-9bkh9\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233129 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-users\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233163 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233203 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-config\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233295 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29b62\" 
(UniqueName: \"kubernetes.io/projected/3c93cb13-815f-48ec-a316-a889a6717f7c-kube-api-access-29b62\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233308 4745 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-credentials\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233321 4745 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-openstack-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233332 4745 reconciler_common.go:293] "Volume detached for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-users\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233342 4745 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-ca\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233351 4745 reconciler_common.go:293] "Volume detached for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/3c93cb13-815f-48ec-a316-a889a6717f7c-sasl-config\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.233362 4745 reconciler_common.go:293] "Volume detached for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/3c93cb13-815f-48ec-a316-a889a6717f7c-default-interconnect-inter-router-credentials\") on node \"crc\" DevicePath \"\"" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334402 4745 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bkh9\" (UniqueName: \"kubernetes.io/projected/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-kube-api-access-9bkh9\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334514 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-users\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334550 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334575 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-config\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334678 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-credentials\") pod 
\"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334731 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.334752 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.335817 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-config\" (UniqueName: \"kubernetes.io/configmap/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-config\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.339342 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-credentials\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.341218 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-inter-router-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-inter-router-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.341604 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-ca\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-ca\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.347157 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sasl-users\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-sasl-users\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.351760 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-openstack-credentials\" (UniqueName: \"kubernetes.io/secret/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-default-interconnect-openstack-credentials\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: \"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.353980 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bkh9\" (UniqueName: \"kubernetes.io/projected/b58f3451-d46e-43bc-8d65-cb9abbc9de0d-kube-api-access-9bkh9\") pod \"default-interconnect-68864d46cb-wgvm9\" (UID: 
\"b58f3451-d46e-43bc-8d65-cb9abbc9de0d\") " pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.374306 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.682394 4745 generic.go:334] "Generic (PLEG): container finished" podID="aa12986d-ffa2-4a08-9069-77fc4fdd80c6" containerID="c09f6fd57b4510a6536ce6cede5c5d0507077eb14b1eeafd9132c5c0313f2a67" exitCode=0 Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.682550 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerDied","Data":"c09f6fd57b4510a6536ce6cede5c5d0507077eb14b1eeafd9132c5c0313f2a67"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.683567 4745 scope.go:117] "RemoveContainer" containerID="c09f6fd57b4510a6536ce6cede5c5d0507077eb14b1eeafd9132c5c0313f2a67" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.690415 4745 generic.go:334] "Generic (PLEG): container finished" podID="1d506c7b-246f-4aca-b3ac-635dbc53b579" containerID="cfb887723d4d194f268766af8ee4a172a3ee0c93bac5baa127b409ccb719c01a" exitCode=0 Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.690500 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerDied","Data":"cfb887723d4d194f268766af8ee4a172a3ee0c93bac5baa127b409ccb719c01a"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.691244 4745 scope.go:117] "RemoveContainer" containerID="cfb887723d4d194f268766af8ee4a172a3ee0c93bac5baa127b409ccb719c01a" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.691820 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["service-telemetry/default-interconnect-68864d46cb-wgvm9"] Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.700407 4745 generic.go:334] "Generic (PLEG): container finished" podID="f88023b4-4c23-4946-a12b-3f0cdab93771" containerID="46eca11823ff9b357cffb37437e48eab6984b261c9db2831eaf601f907a099e2" exitCode=0 Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.700572 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerDied","Data":"46eca11823ff9b357cffb37437e48eab6984b261c9db2831eaf601f907a099e2"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.701328 4745 scope.go:117] "RemoveContainer" containerID="46eca11823ff9b357cffb37437e48eab6984b261c9db2831eaf601f907a099e2" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.725702 4745 generic.go:334] "Generic (PLEG): container finished" podID="71276875-43be-4d09-a25d-4327369c3a53" containerID="d06419cf324315fc94fd28fa12b688584ca303ebd7cd1fabeb84325363ee24a7" exitCode=0 Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.726972 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerDied","Data":"d06419cf324315fc94fd28fa12b688584ca303ebd7cd1fabeb84325363ee24a7"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.729972 4745 scope.go:117] "RemoveContainer" containerID="d06419cf324315fc94fd28fa12b688584ca303ebd7cd1fabeb84325363ee24a7" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.745792 4745 generic.go:334] "Generic (PLEG): container finished" podID="651a6724-09fd-4395-859f-7fdff0781163" containerID="76c6559d5164fbe901447e34e84a6f2a925b8533c5eaa0bf05828b32a7fd485b" exitCode=0 Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.745927 4745 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerDied","Data":"76c6559d5164fbe901447e34e84a6f2a925b8533c5eaa0bf05828b32a7fd485b"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.746592 4745 scope.go:117] "RemoveContainer" containerID="76c6559d5164fbe901447e34e84a6f2a925b8533c5eaa0bf05828b32a7fd485b" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.750771 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" event={"ID":"3c93cb13-815f-48ec-a316-a889a6717f7c","Type":"ContainerDied","Data":"dcfd75548cf8b7e9217b0b25a0988010bbd42dcd7ccfe6fcc357469ebb9d89b8"} Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.750817 4745 scope.go:117] "RemoveContainer" containerID="6b68238373df0047ebeb6c52001a21e2ba8abdae07e988851a91457251aa4b36" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.750951 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/default-interconnect-68864d46cb-2x4w6" Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.923629 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"] Mar 19 00:36:39 crc kubenswrapper[4745]: I0319 00:36:39.929753 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/default-interconnect-68864d46cb-2x4w6"] Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.150558 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c93cb13-815f-48ec-a316-a889a6717f7c" path="/var/lib/kubelet/pods/3c93cb13-815f-48ec-a316-a889a6717f7c/volumes" Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.762702 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerStarted","Data":"6110402550d2e45a3d9817243885ca4b9910cf2a5b4ad8e1a58a9ed6c6bf7d17"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.786192 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" event={"ID":"b58f3451-d46e-43bc-8d65-cb9abbc9de0d","Type":"ContainerStarted","Data":"f32e002f0611eab40775ff09d1f7ff65af0d64799eba82d8809c019e2824ffe2"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.786259 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" event={"ID":"b58f3451-d46e-43bc-8d65-cb9abbc9de0d","Type":"ContainerStarted","Data":"66841ffae63083890e097a1f39b37e2b0215ed6f7b659978ae650f60a5f5f83c"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.791516 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" 
event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"12e23535161647cd463afdcfec6ad96399fa5bb6532d42e03b75fc393b100ecf"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.796972 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"f29f4c0c2b57bfa2be0cd49debb6f1e3ddbea4941fa8e360cbda5af079fd4376"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.807118 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"386eaf76e4df5175ffae4ffdc6f171e30f796b0ffdf10c3221b9a1b9d24969d5"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.809104 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerStarted","Data":"155999ef7533a1bd0f394a36bb5200b2961a6d65dd5c2443e0fa2cc816f26527"} Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.834587 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/default-interconnect-68864d46cb-wgvm9" podStartSLOduration=3.834559741 podStartE2EDuration="3.834559741s" podCreationTimestamp="2026-03-19 00:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 00:36:40.82078679 +0000 UTC m=+1765.358981921" watchObservedRunningTime="2026-03-19 00:36:40.834559741 +0000 UTC m=+1765.372754872" Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.926803 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/qdr-test"] Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.929085 
4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.932455 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"qdr-test-config" Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.932775 4745 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-interconnect-selfsigned" Mar 19 00:36:40 crc kubenswrapper[4745]: I0319 00:36:40.938240 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.068116 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lltwb\" (UniqueName: \"kubernetes.io/projected/9bc30d38-f540-480c-9289-45dbe7a4401b-kube-api-access-lltwb\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.068212 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/9bc30d38-f540-480c-9289-45dbe7a4401b-qdr-test-config\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.068370 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/9bc30d38-f540-480c-9289-45dbe7a4401b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.170457 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lltwb\" (UniqueName: 
\"kubernetes.io/projected/9bc30d38-f540-480c-9289-45dbe7a4401b-kube-api-access-lltwb\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.170538 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/9bc30d38-f540-480c-9289-45dbe7a4401b-qdr-test-config\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.170609 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/9bc30d38-f540-480c-9289-45dbe7a4401b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.171622 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"qdr-test-config\" (UniqueName: \"kubernetes.io/configmap/9bc30d38-f540-480c-9289-45dbe7a4401b-qdr-test-config\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.179061 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-interconnect-selfsigned-cert\" (UniqueName: \"kubernetes.io/secret/9bc30d38-f540-480c-9289-45dbe7a4401b-default-interconnect-selfsigned-cert\") pod \"qdr-test\" (UID: \"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.190697 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lltwb\" (UniqueName: \"kubernetes.io/projected/9bc30d38-f540-480c-9289-45dbe7a4401b-kube-api-access-lltwb\") pod \"qdr-test\" (UID: 
\"9bc30d38-f540-480c-9289-45dbe7a4401b\") " pod="service-telemetry/qdr-test" Mar 19 00:36:41 crc kubenswrapper[4745]: I0319 00:36:41.438142 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/qdr-test" Mar 19 00:36:42 crc kubenswrapper[4745]: W0319 00:36:42.364804 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bc30d38_f540_480c_9289_45dbe7a4401b.slice/crio-baee122dd91d198c13e90105d156424a7feb3df81f806b411ca841f89e02c862 WatchSource:0}: Error finding container baee122dd91d198c13e90105d156424a7feb3df81f806b411ca841f89e02c862: Status 404 returned error can't find the container with id baee122dd91d198c13e90105d156424a7feb3df81f806b411ca841f89e02c862 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.372715 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/qdr-test"] Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.878647 4745 generic.go:334] "Generic (PLEG): container finished" podID="f88023b4-4c23-4946-a12b-3f0cdab93771" containerID="6110402550d2e45a3d9817243885ca4b9910cf2a5b4ad8e1a58a9ed6c6bf7d17" exitCode=0 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.878760 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerDied","Data":"6110402550d2e45a3d9817243885ca4b9910cf2a5b4ad8e1a58a9ed6c6bf7d17"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.878827 4745 scope.go:117] "RemoveContainer" containerID="46eca11823ff9b357cffb37437e48eab6984b261c9db2831eaf601f907a099e2" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.880620 4745 scope.go:117] "RemoveContainer" containerID="6110402550d2e45a3d9817243885ca4b9910cf2a5b4ad8e1a58a9ed6c6bf7d17" Mar 19 00:36:42 crc kubenswrapper[4745]: E0319 00:36:42.881040 4745 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw_service-telemetry(f88023b4-4c23-4946-a12b-3f0cdab93771)\"" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" podUID="f88023b4-4c23-4946-a12b-3f0cdab93771" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.892943 4745 generic.go:334] "Generic (PLEG): container finished" podID="71276875-43be-4d09-a25d-4327369c3a53" containerID="12e23535161647cd463afdcfec6ad96399fa5bb6532d42e03b75fc393b100ecf" exitCode=0 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.893013 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerDied","Data":"12e23535161647cd463afdcfec6ad96399fa5bb6532d42e03b75fc393b100ecf"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.893931 4745 scope.go:117] "RemoveContainer" containerID="12e23535161647cd463afdcfec6ad96399fa5bb6532d42e03b75fc393b100ecf" Mar 19 00:36:42 crc kubenswrapper[4745]: E0319 00:36:42.894197 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg_service-telemetry(71276875-43be-4d09-a25d-4327369c3a53)\"" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" podUID="71276875-43be-4d09-a25d-4327369c3a53" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.909572 4745 generic.go:334] "Generic (PLEG): container finished" podID="651a6724-09fd-4395-859f-7fdff0781163" containerID="f29f4c0c2b57bfa2be0cd49debb6f1e3ddbea4941fa8e360cbda5af079fd4376" exitCode=0 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.910008 4745 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerDied","Data":"f29f4c0c2b57bfa2be0cd49debb6f1e3ddbea4941fa8e360cbda5af079fd4376"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.910834 4745 scope.go:117] "RemoveContainer" containerID="f29f4c0c2b57bfa2be0cd49debb6f1e3ddbea4941fa8e360cbda5af079fd4376" Mar 19 00:36:42 crc kubenswrapper[4745]: E0319 00:36:42.911104 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl_service-telemetry(651a6724-09fd-4395-859f-7fdff0781163)\"" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" podUID="651a6724-09fd-4395-859f-7fdff0781163" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.911632 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"9bc30d38-f540-480c-9289-45dbe7a4401b","Type":"ContainerStarted","Data":"baee122dd91d198c13e90105d156424a7feb3df81f806b411ca841f89e02c862"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.913395 4745 generic.go:334] "Generic (PLEG): container finished" podID="aa12986d-ffa2-4a08-9069-77fc4fdd80c6" containerID="386eaf76e4df5175ffae4ffdc6f171e30f796b0ffdf10c3221b9a1b9d24969d5" exitCode=0 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.913432 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerDied","Data":"386eaf76e4df5175ffae4ffdc6f171e30f796b0ffdf10c3221b9a1b9d24969d5"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.913785 4745 scope.go:117] "RemoveContainer" 
containerID="386eaf76e4df5175ffae4ffdc6f171e30f796b0ffdf10c3221b9a1b9d24969d5" Mar 19 00:36:42 crc kubenswrapper[4745]: E0319 00:36:42.914034 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6_service-telemetry(aa12986d-ffa2-4a08-9069-77fc4fdd80c6)\"" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" podUID="aa12986d-ffa2-4a08-9069-77fc4fdd80c6" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.922747 4745 generic.go:334] "Generic (PLEG): container finished" podID="1d506c7b-246f-4aca-b3ac-635dbc53b579" containerID="155999ef7533a1bd0f394a36bb5200b2961a6d65dd5c2443e0fa2cc816f26527" exitCode=0 Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.922817 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerDied","Data":"155999ef7533a1bd0f394a36bb5200b2961a6d65dd5c2443e0fa2cc816f26527"} Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.923555 4745 scope.go:117] "RemoveContainer" containerID="155999ef7533a1bd0f394a36bb5200b2961a6d65dd5c2443e0fa2cc816f26527" Mar 19 00:36:42 crc kubenswrapper[4745]: E0319 00:36:42.923837 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"bridge\" with CrashLoopBackOff: \"back-off 10s restarting failed container=bridge pod=default-cloud1-coll-event-smartgateway-7f44857896-xgs4c_service-telemetry(1d506c7b-246f-4aca-b3ac-635dbc53b579)\"" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" podUID="1d506c7b-246f-4aca-b3ac-635dbc53b579" Mar 19 00:36:42 crc kubenswrapper[4745]: I0319 00:36:42.965905 4745 scope.go:117] "RemoveContainer" 
containerID="d06419cf324315fc94fd28fa12b688584ca303ebd7cd1fabeb84325363ee24a7" Mar 19 00:36:43 crc kubenswrapper[4745]: I0319 00:36:43.941206 4745 scope.go:117] "RemoveContainer" containerID="76c6559d5164fbe901447e34e84a6f2a925b8533c5eaa0bf05828b32a7fd485b" Mar 19 00:36:44 crc kubenswrapper[4745]: I0319 00:36:44.767234 4745 scope.go:117] "RemoveContainer" containerID="c09f6fd57b4510a6536ce6cede5c5d0507077eb14b1eeafd9132c5c0313f2a67" Mar 19 00:36:44 crc kubenswrapper[4745]: I0319 00:36:44.811150 4745 scope.go:117] "RemoveContainer" containerID="cfb887723d4d194f268766af8ee4a172a3ee0c93bac5baa127b409ccb719c01a" Mar 19 00:36:52 crc kubenswrapper[4745]: I0319 00:36:52.138200 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:36:52 crc kubenswrapper[4745]: E0319 00:36:52.139280 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:36:54 crc kubenswrapper[4745]: I0319 00:36:54.138071 4745 scope.go:117] "RemoveContainer" containerID="6110402550d2e45a3d9817243885ca4b9910cf2a5b4ad8e1a58a9ed6c6bf7d17" Mar 19 00:36:55 crc kubenswrapper[4745]: I0319 00:36:55.138015 4745 scope.go:117] "RemoveContainer" containerID="155999ef7533a1bd0f394a36bb5200b2961a6d65dd5c2443e0fa2cc816f26527" Mar 19 00:36:56 crc kubenswrapper[4745]: I0319 00:36:56.142576 4745 scope.go:117] "RemoveContainer" containerID="f29f4c0c2b57bfa2be0cd49debb6f1e3ddbea4941fa8e360cbda5af079fd4376" Mar 19 00:36:56 crc kubenswrapper[4745]: I0319 00:36:56.143364 4745 scope.go:117] "RemoveContainer" 
containerID="12e23535161647cd463afdcfec6ad96399fa5bb6532d42e03b75fc393b100ecf" Mar 19 00:36:57 crc kubenswrapper[4745]: I0319 00:36:57.139322 4745 scope.go:117] "RemoveContainer" containerID="386eaf76e4df5175ffae4ffdc6f171e30f796b0ffdf10c3221b9a1b9d24969d5" Mar 19 00:36:58 crc kubenswrapper[4745]: E0319 00:36:58.471713 4745 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/tripleowallabycentos9/openstack-qdrouterd:current-tripleo" Mar 19 00:36:58 crc kubenswrapper[4745]: E0319 00:36:58.472335 4745 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:qdr,Image:quay.io/tripleowallabycentos9/openstack-qdrouterd:current-tripleo,Command:[/usr/sbin/qdrouterd -c /etc/qpid-dispatch/qdrouterd.conf],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:amqp,HostPort:0,ContainerPort:5672,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:default-interconnect-selfsigned-cert,ReadOnly:false,MountPath:/etc/pki/tls/certs/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:qdr-test-config,ReadOnly:false,MountPath:/etc/qpid-dispatch/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lltwb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL 
MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod qdr-test_service-telemetry(9bc30d38-f540-480c-9289-45dbe7a4401b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 00:36:58 crc kubenswrapper[4745]: E0319 00:36:58.473876 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"qdr\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="service-telemetry/qdr-test" podUID="9bc30d38-f540-480c-9289-45dbe7a4401b" Mar 19 00:36:59 crc kubenswrapper[4745]: I0319 00:36:59.441584 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg" event={"ID":"71276875-43be-4d09-a25d-4327369c3a53","Type":"ContainerStarted","Data":"8585c2a55e2f852cccafc303b1e6bd5e28e4d00f6ce8d9a1261afbc1fe83f17c"} Mar 19 00:36:59 crc kubenswrapper[4745]: I0319 00:36:59.444873 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl" event={"ID":"651a6724-09fd-4395-859f-7fdff0781163","Type":"ContainerStarted","Data":"50de5a905defabe2101a54ad4ef7569e539363ead6bc9ab9a65fd3c979cad523"} Mar 19 00:36:59 crc kubenswrapper[4745]: I0319 00:36:59.448261 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6" event={"ID":"aa12986d-ffa2-4a08-9069-77fc4fdd80c6","Type":"ContainerStarted","Data":"24b1d3c893b308ebe909fade6a32dbc0ac0c99873d1cbd61e4d0b4f1dcd4b1c4"} 
Mar 19 00:36:59 crc kubenswrapper[4745]: I0319 00:36:59.451302 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-coll-event-smartgateway-7f44857896-xgs4c" event={"ID":"1d506c7b-246f-4aca-b3ac-635dbc53b579","Type":"ContainerStarted","Data":"628e3e976e83bbd945045b092fe5512fde789095faa7e3dddf38c9f347dd4847"} Mar 19 00:36:59 crc kubenswrapper[4745]: I0319 00:36:59.454419 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw" event={"ID":"f88023b4-4c23-4946-a12b-3f0cdab93771","Type":"ContainerStarted","Data":"34fe18c9b12f936b1b91f114837dc9407930c4bd386f12593e5324d4dbb29032"} Mar 19 00:36:59 crc kubenswrapper[4745]: E0319 00:36:59.458962 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"qdr\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/tripleowallabycentos9/openstack-qdrouterd:current-tripleo\\\"\"" pod="service-telemetry/qdr-test" podUID="9bc30d38-f540-480c-9289-45dbe7a4401b" Mar 19 00:37:06 crc kubenswrapper[4745]: I0319 00:37:06.143020 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:37:06 crc kubenswrapper[4745]: E0319 00:37:06.144162 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.381012 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/stf-smoketest-smoke1-9prq4"] Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.391435 4745 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.400012 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-healthcheck-log" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.400121 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-entrypoint-script" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.400449 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-entrypoint-script" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.400777 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-sensubility-config" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.401085 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-9prq4"] Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.403141 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-ceilometer-publisher" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.403141 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"stf-smoketest-collectd-config" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.562857 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.563276 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.563411 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.563511 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.563604 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-492lj\" (UniqueName: \"kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.563694 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc 
kubenswrapper[4745]: I0319 00:37:11.563786 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.666999 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667183 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667256 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667311 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-492lj\" (UniqueName: \"kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " 
pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667400 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667466 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.667646 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.669020 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.669123 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " 
pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.669200 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.669565 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.669945 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.670210 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script\") pod \"stf-smoketest-smoke1-9prq4\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.694052 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-492lj\" (UniqueName: \"kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj\") pod \"stf-smoketest-smoke1-9prq4\" (UID: 
\"7de37e62-8066-4f02-85a3-4490078b4007\") " pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.723402 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/curl"] Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.727445 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.731694 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-9prq4" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.735336 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.871172 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57qgf\" (UniqueName: \"kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf\") pod \"curl\" (UID: \"57309d44-0759-4b3b-954f-8253b2f8a0b3\") " pod="service-telemetry/curl" Mar 19 00:37:11 crc kubenswrapper[4745]: I0319 00:37:11.973308 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57qgf\" (UniqueName: \"kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf\") pod \"curl\" (UID: \"57309d44-0759-4b3b-954f-8253b2f8a0b3\") " pod="service-telemetry/curl" Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.001082 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57qgf\" (UniqueName: \"kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf\") pod \"curl\" (UID: \"57309d44-0759-4b3b-954f-8253b2f8a0b3\") " pod="service-telemetry/curl" Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.127352 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.291004 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/stf-smoketest-smoke1-9prq4"] Mar 19 00:37:12 crc kubenswrapper[4745]: W0319 00:37:12.321684 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7de37e62_8066_4f02_85a3_4490078b4007.slice/crio-e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184 WatchSource:0}: Error finding container e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184: Status 404 returned error can't find the container with id e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184 Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.533815 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/curl"] Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.565665 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerStarted","Data":"e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184"} Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.567781 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"57309d44-0759-4b3b-954f-8253b2f8a0b3","Type":"ContainerStarted","Data":"739c75b2e0182c90e96d768d71238aef55c96dda097b3230911bab9794c28f38"} Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.569731 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/qdr-test" event={"ID":"9bc30d38-f540-480c-9289-45dbe7a4401b","Type":"ContainerStarted","Data":"39c63c6437ed94b284a0ebbfdee9d766b5057b6cdd46293b5d9dd99cbb801f7c"} Mar 19 00:37:12 crc kubenswrapper[4745]: I0319 00:37:12.597256 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="service-telemetry/qdr-test" podStartSLOduration=3.338983274 podStartE2EDuration="32.597226327s" podCreationTimestamp="2026-03-19 00:36:40 +0000 UTC" firstStartedPulling="2026-03-19 00:36:42.367944598 +0000 UTC m=+1766.906139729" lastFinishedPulling="2026-03-19 00:37:11.626187651 +0000 UTC m=+1796.164382782" observedRunningTime="2026-03-19 00:37:12.592266702 +0000 UTC m=+1797.130461833" watchObservedRunningTime="2026-03-19 00:37:12.597226327 +0000 UTC m=+1797.135421458" Mar 19 00:37:15 crc kubenswrapper[4745]: I0319 00:37:15.618548 4745 generic.go:334] "Generic (PLEG): container finished" podID="57309d44-0759-4b3b-954f-8253b2f8a0b3" containerID="12774089bafc3207d1c8eed8db8ca60cf7f5b505e9b2c47330a0acdcf61fbf8d" exitCode=0 Mar 19 00:37:15 crc kubenswrapper[4745]: I0319 00:37:15.618672 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"57309d44-0759-4b3b-954f-8253b2f8a0b3","Type":"ContainerDied","Data":"12774089bafc3207d1c8eed8db8ca60cf7f5b505e9b2c47330a0acdcf61fbf8d"} Mar 19 00:37:21 crc kubenswrapper[4745]: I0319 00:37:21.140850 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:37:21 crc kubenswrapper[4745]: E0319 00:37:21.142174 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:37:22 crc kubenswrapper[4745]: I0319 00:37:22.982073 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/curl" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.050684 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57qgf\" (UniqueName: \"kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf\") pod \"57309d44-0759-4b3b-954f-8253b2f8a0b3\" (UID: \"57309d44-0759-4b3b-954f-8253b2f8a0b3\") " Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.065451 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf" (OuterVolumeSpecName: "kube-api-access-57qgf") pod "57309d44-0759-4b3b-954f-8253b2f8a0b3" (UID: "57309d44-0759-4b3b-954f-8253b2f8a0b3"). InnerVolumeSpecName "kube-api-access-57qgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.153695 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57qgf\" (UniqueName: \"kubernetes.io/projected/57309d44-0759-4b3b-954f-8253b2f8a0b3-kube-api-access-57qgf\") on node \"crc\" DevicePath \"\"" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.181098 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_curl_57309d44-0759-4b3b-954f-8253b2f8a0b3/curl/0.log" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.453014 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4pxj4_7ed8bcc5-1389-4cff-b64c-bf3b813f642e/prometheus-webhook-snmp/0.log" Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.690045 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/curl" event={"ID":"57309d44-0759-4b3b-954f-8253b2f8a0b3","Type":"ContainerDied","Data":"739c75b2e0182c90e96d768d71238aef55c96dda097b3230911bab9794c28f38"} Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.690109 4745 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="739c75b2e0182c90e96d768d71238aef55c96dda097b3230911bab9794c28f38"
Mar 19 00:37:23 crc kubenswrapper[4745]: I0319 00:37:23.690146 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/curl"
Mar 19 00:37:25 crc kubenswrapper[4745]: I0319 00:37:25.714903 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerStarted","Data":"48caa8f3b4d5b8c1192e96bf757f22c649d20960c122f3e277ed9d83d18d42d7"}
Mar 19 00:37:33 crc kubenswrapper[4745]: I0319 00:37:33.138142 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:37:33 crc kubenswrapper[4745]: E0319 00:37:33.139153 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:37:33 crc kubenswrapper[4745]: I0319 00:37:33.796767 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerStarted","Data":"2bde59b3410df5f3ca3524d81967b5360d24635e0a7e3f093cd6e92af7b10a42"}
Mar 19 00:37:33 crc kubenswrapper[4745]: I0319 00:37:33.821780 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/stf-smoketest-smoke1-9prq4" podStartSLOduration=2.40646354 podStartE2EDuration="22.821754961s" podCreationTimestamp="2026-03-19 00:37:11 +0000 UTC" firstStartedPulling="2026-03-19 00:37:12.324132166 +0000 UTC m=+1796.862327307" lastFinishedPulling="2026-03-19 00:37:32.739423597 +0000 UTC m=+1817.277618728" observedRunningTime="2026-03-19 00:37:33.814037869 +0000 UTC m=+1818.352233020" watchObservedRunningTime="2026-03-19 00:37:33.821754961 +0000 UTC m=+1818.359950092"
Mar 19 00:37:46 crc kubenswrapper[4745]: I0319 00:37:46.143706 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:37:46 crc kubenswrapper[4745]: E0319 00:37:46.144909 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:37:53 crc kubenswrapper[4745]: I0319 00:37:53.624078 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4pxj4_7ed8bcc5-1389-4cff-b64c-bf3b813f642e/prometheus-webhook-snmp/0.log"
Mar 19 00:37:58 crc kubenswrapper[4745]: I0319 00:37:58.996869 4745 generic.go:334] "Generic (PLEG): container finished" podID="7de37e62-8066-4f02-85a3-4490078b4007" containerID="48caa8f3b4d5b8c1192e96bf757f22c649d20960c122f3e277ed9d83d18d42d7" exitCode=0
Mar 19 00:37:58 crc kubenswrapper[4745]: I0319 00:37:58.997103 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerDied","Data":"48caa8f3b4d5b8c1192e96bf757f22c649d20960c122f3e277ed9d83d18d42d7"}
Mar 19 00:37:59 crc kubenswrapper[4745]: I0319 00:37:58.998197 4745 scope.go:117] "RemoveContainer" containerID="48caa8f3b4d5b8c1192e96bf757f22c649d20960c122f3e277ed9d83d18d42d7"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.147926 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564678-m5hsl"]
Mar 19 00:38:00 crc kubenswrapper[4745]: E0319 00:38:00.148265 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57309d44-0759-4b3b-954f-8253b2f8a0b3" containerName="curl"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.148284 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="57309d44-0759-4b3b-954f-8253b2f8a0b3" containerName="curl"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.148469 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="57309d44-0759-4b3b-954f-8253b2f8a0b3" containerName="curl"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.149148 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564678-m5hsl"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.151208 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564678-m5hsl"]
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.152350 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.152638 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.157607 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v48z6\" (UniqueName: \"kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6\") pod \"auto-csr-approver-29564678-m5hsl\" (UID: \"c3ff29f0-5d28-4572-bd21-aac2f86091a8\") " pod="openshift-infra/auto-csr-approver-29564678-m5hsl"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.161411 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.260278 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v48z6\" (UniqueName: \"kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6\") pod \"auto-csr-approver-29564678-m5hsl\" (UID: \"c3ff29f0-5d28-4572-bd21-aac2f86091a8\") " pod="openshift-infra/auto-csr-approver-29564678-m5hsl"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.285691 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v48z6\" (UniqueName: \"kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6\") pod \"auto-csr-approver-29564678-m5hsl\" (UID: \"c3ff29f0-5d28-4572-bd21-aac2f86091a8\") " pod="openshift-infra/auto-csr-approver-29564678-m5hsl"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.487253 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564678-m5hsl"
Mar 19 00:38:00 crc kubenswrapper[4745]: I0319 00:38:00.720779 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564678-m5hsl"]
Mar 19 00:38:01 crc kubenswrapper[4745]: I0319 00:38:01.014135 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" event={"ID":"c3ff29f0-5d28-4572-bd21-aac2f86091a8","Type":"ContainerStarted","Data":"4fe4236b2795b4676a30cbdd0f69a1604ccf5cd49f0c60a5899e82082e2165de"}
Mar 19 00:38:01 crc kubenswrapper[4745]: I0319 00:38:01.138679 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:38:01 crc kubenswrapper[4745]: E0319 00:38:01.138955 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:38:02 crc kubenswrapper[4745]: I0319 00:38:02.026666 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" event={"ID":"c3ff29f0-5d28-4572-bd21-aac2f86091a8","Type":"ContainerStarted","Data":"97740a413cb8dbbe5a9163cc3b0f901c899139f75274026549c0e4cc3732d413"}
Mar 19 00:38:02 crc kubenswrapper[4745]: I0319 00:38:02.044621 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" podStartSLOduration=1.076213796 podStartE2EDuration="2.04459914s" podCreationTimestamp="2026-03-19 00:38:00 +0000 UTC" firstStartedPulling="2026-03-19 00:38:00.726752542 +0000 UTC m=+1845.264947673" lastFinishedPulling="2026-03-19 00:38:01.695137886 +0000 UTC m=+1846.233333017" observedRunningTime="2026-03-19 00:38:02.041815033 +0000 UTC m=+1846.580010184" watchObservedRunningTime="2026-03-19 00:38:02.04459914 +0000 UTC m=+1846.582794271"
Mar 19 00:38:03 crc kubenswrapper[4745]: I0319 00:38:03.036519 4745 generic.go:334] "Generic (PLEG): container finished" podID="c3ff29f0-5d28-4572-bd21-aac2f86091a8" containerID="97740a413cb8dbbe5a9163cc3b0f901c899139f75274026549c0e4cc3732d413" exitCode=0
Mar 19 00:38:03 crc kubenswrapper[4745]: I0319 00:38:03.036584 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" event={"ID":"c3ff29f0-5d28-4572-bd21-aac2f86091a8","Type":"ContainerDied","Data":"97740a413cb8dbbe5a9163cc3b0f901c899139f75274026549c0e4cc3732d413"}
Mar 19 00:38:04 crc kubenswrapper[4745]: I0319 00:38:04.292233 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564678-m5hsl"
Mar 19 00:38:04 crc kubenswrapper[4745]: I0319 00:38:04.332563 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v48z6\" (UniqueName: \"kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6\") pod \"c3ff29f0-5d28-4572-bd21-aac2f86091a8\" (UID: \"c3ff29f0-5d28-4572-bd21-aac2f86091a8\") "
Mar 19 00:38:04 crc kubenswrapper[4745]: I0319 00:38:04.338797 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6" (OuterVolumeSpecName: "kube-api-access-v48z6") pod "c3ff29f0-5d28-4572-bd21-aac2f86091a8" (UID: "c3ff29f0-5d28-4572-bd21-aac2f86091a8"). InnerVolumeSpecName "kube-api-access-v48z6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:38:04 crc kubenswrapper[4745]: I0319 00:38:04.434627 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v48z6\" (UniqueName: \"kubernetes.io/projected/c3ff29f0-5d28-4572-bd21-aac2f86091a8-kube-api-access-v48z6\") on node \"crc\" DevicePath \"\""
Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.055147 4745 generic.go:334] "Generic (PLEG): container finished" podID="7de37e62-8066-4f02-85a3-4490078b4007" containerID="2bde59b3410df5f3ca3524d81967b5360d24635e0a7e3f093cd6e92af7b10a42" exitCode=0
Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.055253 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerDied","Data":"2bde59b3410df5f3ca3524d81967b5360d24635e0a7e3f093cd6e92af7b10a42"}
Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.057599 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564678-m5hsl" event={"ID":"c3ff29f0-5d28-4572-bd21-aac2f86091a8","Type":"ContainerDied","Data":"4fe4236b2795b4676a30cbdd0f69a1604ccf5cd49f0c60a5899e82082e2165de"}
Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.057634 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fe4236b2795b4676a30cbdd0f69a1604ccf5cd49f0c60a5899e82082e2165de"
Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.057707 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564678-m5hsl"
Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.115821 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564672-tpcgv"]
Mar 19 00:38:05 crc kubenswrapper[4745]: I0319 00:38:05.124245 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564672-tpcgv"]
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.152659 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="550c50ae-5519-4c0d-b2b0-7415d134808f" path="/var/lib/kubelet/pods/550c50ae-5519-4c0d-b2b0-7415d134808f/volumes"
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.331058 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-9prq4"
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385291 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") "
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385428 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") "
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385480 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") "
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385564 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") "
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385752 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-492lj\" (UniqueName: \"kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") "
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385814 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") "
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.385850 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script\") pod \"7de37e62-8066-4f02-85a3-4490078b4007\" (UID: \"7de37e62-8066-4f02-85a3-4490078b4007\") "
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.410804 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config" (OuterVolumeSpecName: "collectd-config") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "collectd-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.412029 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config" (OuterVolumeSpecName: "sensubility-config") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "sensubility-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.412734 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log" (OuterVolumeSpecName: "healthcheck-log") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "healthcheck-log". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.414379 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj" (OuterVolumeSpecName: "kube-api-access-492lj") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "kube-api-access-492lj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.415702 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script" (OuterVolumeSpecName: "ceilometer-entrypoint-script") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "ceilometer-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.416013 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script" (OuterVolumeSpecName: "collectd-entrypoint-script") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "collectd-entrypoint-script". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.425853 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher" (OuterVolumeSpecName: "ceilometer-publisher") pod "7de37e62-8066-4f02-85a3-4490078b4007" (UID: "7de37e62-8066-4f02-85a3-4490078b4007"). InnerVolumeSpecName "ceilometer-publisher". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488411 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-492lj\" (UniqueName: \"kubernetes.io/projected/7de37e62-8066-4f02-85a3-4490078b4007-kube-api-access-492lj\") on node \"crc\" DevicePath \"\""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488469 4745 reconciler_common.go:293] "Volume detached for volume \"healthcheck-log\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-healthcheck-log\") on node \"crc\" DevicePath \"\""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488480 4745 reconciler_common.go:293] "Volume detached for volume \"collectd-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-entrypoint-script\") on node \"crc\" DevicePath \"\""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488489 4745 reconciler_common.go:293] "Volume detached for volume \"ceilometer-entrypoint-script\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-entrypoint-script\") on node \"crc\" DevicePath \"\""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488499 4745 reconciler_common.go:293] "Volume detached for volume \"sensubility-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-sensubility-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488510 4745 reconciler_common.go:293] "Volume detached for volume \"collectd-config\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-collectd-config\") on node \"crc\" DevicePath \"\""
Mar 19 00:38:06 crc kubenswrapper[4745]: I0319 00:38:06.488519 4745 reconciler_common.go:293] "Volume detached for volume \"ceilometer-publisher\" (UniqueName: \"kubernetes.io/configmap/7de37e62-8066-4f02-85a3-4490078b4007-ceilometer-publisher\") on node \"crc\" DevicePath \"\""
Mar 19 00:38:07 crc kubenswrapper[4745]: I0319 00:38:07.078297 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/stf-smoketest-smoke1-9prq4" event={"ID":"7de37e62-8066-4f02-85a3-4490078b4007","Type":"ContainerDied","Data":"e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184"}
Mar 19 00:38:07 crc kubenswrapper[4745]: I0319 00:38:07.078763 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7f5113060a08f8b614b385962510239062de48487e344ad4eb895bea1703184"
Mar 19 00:38:07 crc kubenswrapper[4745]: I0319 00:38:07.078435 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/stf-smoketest-smoke1-9prq4"
Mar 19 00:38:08 crc kubenswrapper[4745]: I0319 00:38:08.280969 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-9prq4_7de37e62-8066-4f02-85a3-4490078b4007/smoketest-collectd/0.log"
Mar 19 00:38:08 crc kubenswrapper[4745]: I0319 00:38:08.530434 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_stf-smoketest-smoke1-9prq4_7de37e62-8066-4f02-85a3-4490078b4007/smoketest-ceilometer/0.log"
Mar 19 00:38:08 crc kubenswrapper[4745]: I0319 00:38:08.766781 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-interconnect-68864d46cb-wgvm9_b58f3451-d46e-43bc-8d65-cb9abbc9de0d/default-interconnect/0.log"
Mar 19 00:38:09 crc kubenswrapper[4745]: I0319 00:38:09.001828 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg_71276875-43be-4d09-a25d-4327369c3a53/bridge/2.log"
Mar 19 00:38:09 crc kubenswrapper[4745]: I0319 00:38:09.290377 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-meter-smartgateway-7cd87f9766-lmvgg_71276875-43be-4d09-a25d-4327369c3a53/sg-core/0.log"
Mar 19 00:38:09 crc kubenswrapper[4745]: I0319 00:38:09.558222 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7f44857896-xgs4c_1d506c7b-246f-4aca-b3ac-635dbc53b579/bridge/2.log"
Mar 19 00:38:09 crc kubenswrapper[4745]: I0319 00:38:09.775096 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-coll-event-smartgateway-7f44857896-xgs4c_1d506c7b-246f-4aca-b3ac-635dbc53b579/sg-core/0.log"
Mar 19 00:38:10 crc kubenswrapper[4745]: I0319 00:38:10.062700 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6_aa12986d-ffa2-4a08-9069-77fc4fdd80c6/bridge/2.log"
Mar 19 00:38:10 crc kubenswrapper[4745]: I0319 00:38:10.294645 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-meter-smartgateway-57948895dc-9wmk6_aa12986d-ffa2-4a08-9069-77fc4fdd80c6/sg-core/0.log"
Mar 19 00:38:10 crc kubenswrapper[4745]: I0319 00:38:10.527339 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw_f88023b4-4c23-4946-a12b-3f0cdab93771/bridge/2.log"
Mar 19 00:38:10 crc kubenswrapper[4745]: I0319 00:38:10.796572 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-ceil-event-smartgateway-695bfb6b8c-c2blw_f88023b4-4c23-4946-a12b-3f0cdab93771/sg-core/0.log"
Mar 19 00:38:11 crc kubenswrapper[4745]: I0319 00:38:11.104165 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl_651a6724-09fd-4395-859f-7fdff0781163/bridge/2.log"
Mar 19 00:38:11 crc kubenswrapper[4745]: I0319 00:38:11.376667 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-cloud1-sens-meter-smartgateway-5759b4d97-vb8sl_651a6724-09fd-4395-859f-7fdff0781163/sg-core/0.log"
Mar 19 00:38:15 crc kubenswrapper[4745]: I0319 00:38:15.137601 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:38:15 crc kubenswrapper[4745]: E0319 00:38:15.138277 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:38:15 crc kubenswrapper[4745]: I0319 00:38:15.484723 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-ff5d8cc8d-tcxb4_a8895e98-6a3f-4f8a-b671-9a7920ceb390/operator/0.log"
Mar 19 00:38:15 crc kubenswrapper[4745]: I0319 00:38:15.767055 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_prometheus-default-0_7bba4496-9224-467b-80ca-ff25c39604ec/prometheus/0.log"
Mar 19 00:38:16 crc kubenswrapper[4745]: I0319 00:38:16.029665 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_elasticsearch-es-default-0_884040c3-6c56-45b0-881d-e73f52c0ab34/elasticsearch/0.log"
Mar 19 00:38:16 crc kubenswrapper[4745]: I0319 00:38:16.291437 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_default-snmp-webhook-6856cfb745-4pxj4_7ed8bcc5-1389-4cff-b64c-bf3b813f642e/prometheus-webhook-snmp/0.log"
Mar 19 00:38:16 crc kubenswrapper[4745]: I0319 00:38:16.582138 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_alertmanager-default-0_0a13ce6d-9eb6-4d56-adfd-7c51a426c4b7/alertmanager/0.log"
Mar 19 00:38:29 crc kubenswrapper[4745]: I0319 00:38:29.138499 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:38:29 crc kubenswrapper[4745]: E0319 00:38:29.139598 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:38:33 crc kubenswrapper[4745]: I0319 00:38:33.973402 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-c87c48cb6-d4c8j_4754eb2f-bab5-413c-ab43-3b9142082c2f/operator/0.log"
Mar 19 00:38:35 crc kubenswrapper[4745]: I0319 00:38:35.878577 4745 scope.go:117] "RemoveContainer" containerID="ad4d25cdb1eb7abf3f1713a4b642271a43e4a8fa68c0fb36024884e82f682adb"
Mar 19 00:38:35 crc kubenswrapper[4745]: I0319 00:38:35.934112 4745 scope.go:117] "RemoveContainer" containerID="f313fbb3c21be61114f257490bb0a77393588572276eeaf994f032d21e90ad1a"
Mar 19 00:38:35 crc kubenswrapper[4745]: I0319 00:38:35.982684 4745 scope.go:117] "RemoveContainer" containerID="974a4b2b0120bf7547c402b35bf7ecab55db0da6f49394541dc6bc7af4cdda92"
Mar 19 00:38:37 crc kubenswrapper[4745]: I0319 00:38:37.769932 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_smart-gateway-operator-ff5d8cc8d-tcxb4_a8895e98-6a3f-4f8a-b671-9a7920ceb390/operator/0.log"
Mar 19 00:38:38 crc kubenswrapper[4745]: I0319 00:38:38.025855 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_qdr-test_9bc30d38-f540-480c-9289-45dbe7a4401b/qdr/0.log"
Mar 19 00:38:43 crc kubenswrapper[4745]: I0319 00:38:43.138469 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:38:43 crc kubenswrapper[4745]: E0319 00:38:43.139649 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:38:56 crc kubenswrapper[4745]: I0319 00:38:56.143754 4745 scope.go:117] "RemoveContainer" containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2"
Mar 19 00:38:56 crc kubenswrapper[4745]: I0319 00:38:56.937361 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01"}
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.931114 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-45wdk/must-gather-j98n7"]
Mar 19 00:39:17 crc kubenswrapper[4745]: E0319 00:39:17.932159 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3ff29f0-5d28-4572-bd21-aac2f86091a8" containerName="oc"
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932176 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3ff29f0-5d28-4572-bd21-aac2f86091a8" containerName="oc"
Mar 19 00:39:17 crc kubenswrapper[4745]: E0319 00:39:17.932197 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-ceilometer"
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932204 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-ceilometer"
Mar 19 00:39:17 crc kubenswrapper[4745]: E0319 00:39:17.932218 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-collectd"
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932228 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-collectd"
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932385 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-ceilometer"
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932399 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3ff29f0-5d28-4572-bd21-aac2f86091a8" containerName="oc"
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.932408 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="7de37e62-8066-4f02-85a3-4490078b4007" containerName="smoketest-collectd"
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.933175 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-45wdk/must-gather-j98n7"
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.937361 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-45wdk"/"kube-root-ca.crt"
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.937418 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-45wdk"/"openshift-service-ca.crt"
Mar 19 00:39:17 crc kubenswrapper[4745]: I0319 00:39:17.946393 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-45wdk/must-gather-j98n7"]
Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.073751 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncd7b\" (UniqueName: \"kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7"
Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.073865 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7"
Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.175675 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncd7b\" (UniqueName: \"kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7"
Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.175754 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7"
Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.176479 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7"
Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.202809 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncd7b\" (UniqueName: \"kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b\") pod \"must-gather-j98n7\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " pod="openshift-must-gather-45wdk/must-gather-j98n7"
Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.255605 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-45wdk/must-gather-j98n7"
Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.859520 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-45wdk/must-gather-j98n7"]
Mar 19 00:39:18 crc kubenswrapper[4745]: I0319 00:39:18.868598 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 19 00:39:19 crc kubenswrapper[4745]: I0319 00:39:19.129364 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-45wdk/must-gather-j98n7" event={"ID":"08615e98-17b3-40c9-8b9b-e372a9ca1b04","Type":"ContainerStarted","Data":"95b710504e843b1adaa747cda1ceed0cfff7d23d3901dd786e9517db4b1e363b"}
Mar 19 00:39:27 crc kubenswrapper[4745]: I0319 00:39:27.214844 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-45wdk/must-gather-j98n7" event={"ID":"08615e98-17b3-40c9-8b9b-e372a9ca1b04","Type":"ContainerStarted","Data":"9910ba3cd8a666005f7972e55a88ec614e51f3eed198ea44dfa464904fab77e2"}
Mar 19 00:39:27 crc kubenswrapper[4745]: I0319 00:39:27.216012 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-45wdk/must-gather-j98n7" event={"ID":"08615e98-17b3-40c9-8b9b-e372a9ca1b04","Type":"ContainerStarted","Data":"e08cee8d6016575c3039d3145b9b52e25c4da1ffbc14b55a1a247732682d9c5a"}
Mar 19 00:39:27 crc kubenswrapper[4745]: I0319 00:39:27.237038 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-45wdk/must-gather-j98n7" podStartSLOduration=3.074163595 podStartE2EDuration="10.23701552s" podCreationTimestamp="2026-03-19 00:39:17 +0000 UTC" firstStartedPulling="2026-03-19 00:39:18.868534723 +0000 UTC m=+1923.406729854" lastFinishedPulling="2026-03-19 00:39:26.031386648 +0000 UTC m=+1930.569581779" observedRunningTime="2026-03-19 00:39:27.230767604 +0000 UTC m=+1931.768962755" watchObservedRunningTime="2026-03-19 00:39:27.23701552 +0000 UTC m=+1931.775210651"
Mar 19 00:39:36 crc kubenswrapper[4745]: I0319 00:39:36.055561 4745 scope.go:117] "RemoveContainer" containerID="3d3abcaceec0d44feeaa99fe5fc507d2939843b4e5f2688e33f19d72f84aabe1"
Mar 19 00:39:36 crc kubenswrapper[4745]: I0319 00:39:36.098962 4745 scope.go:117] "RemoveContainer" containerID="41cdf9f33044f6a4909a9e2e26ad76fb6b92253759abe1d7140516760d28b75c"
Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.227852 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/infrawatch-operators-629gf"]
Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.229579 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-629gf"
Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.239738 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-629gf"]
Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.363718 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr7tg\" (UniqueName: \"kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg\") pod \"infrawatch-operators-629gf\" (UID: \"f5af6c85-4fb0-4045-b2cb-f96258977b97\") " pod="service-telemetry/infrawatch-operators-629gf"
Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.464993 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vr7tg\" (UniqueName: \"kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg\") pod \"infrawatch-operators-629gf\" (UID: \"f5af6c85-4fb0-4045-b2cb-f96258977b97\") " pod="service-telemetry/infrawatch-operators-629gf"
Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.489174 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr7tg\" (UniqueName: \"kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg\")
pod \"infrawatch-operators-629gf\" (UID: \"f5af6c85-4fb0-4045-b2cb-f96258977b97\") " pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.563551 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:43 crc kubenswrapper[4745]: I0319 00:39:43.908052 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-629gf"] Mar 19 00:39:44 crc kubenswrapper[4745]: I0319 00:39:44.379123 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-629gf" event={"ID":"f5af6c85-4fb0-4045-b2cb-f96258977b97","Type":"ContainerStarted","Data":"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45"} Mar 19 00:39:44 crc kubenswrapper[4745]: I0319 00:39:44.379621 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-629gf" event={"ID":"f5af6c85-4fb0-4045-b2cb-f96258977b97","Type":"ContainerStarted","Data":"d38bfb040a14025f30b9650d668f1ed36e039a8469a10993a3a97e36ef4d5129"} Mar 19 00:39:44 crc kubenswrapper[4745]: I0319 00:39:44.397900 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-629gf" podStartSLOduration=1.2599673120000001 podStartE2EDuration="1.397864081s" podCreationTimestamp="2026-03-19 00:39:43 +0000 UTC" firstStartedPulling="2026-03-19 00:39:43.907610686 +0000 UTC m=+1948.445805817" lastFinishedPulling="2026-03-19 00:39:44.045507455 +0000 UTC m=+1948.583702586" observedRunningTime="2026-03-19 00:39:44.394343631 +0000 UTC m=+1948.932538762" watchObservedRunningTime="2026-03-19 00:39:44.397864081 +0000 UTC m=+1948.936059212" Mar 19 00:39:53 crc kubenswrapper[4745]: I0319 00:39:53.564394 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:53 crc 
kubenswrapper[4745]: I0319 00:39:53.564905 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:53 crc kubenswrapper[4745]: I0319 00:39:53.600959 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:54 crc kubenswrapper[4745]: I0319 00:39:54.482568 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:54 crc kubenswrapper[4745]: I0319 00:39:54.531588 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-629gf"] Mar 19 00:39:56 crc kubenswrapper[4745]: I0319 00:39:56.473111 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-629gf" podUID="f5af6c85-4fb0-4045-b2cb-f96258977b97" containerName="registry-server" containerID="cri-o://c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45" gracePeriod=2 Mar 19 00:39:56 crc kubenswrapper[4745]: I0319 00:39:56.861063 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:56 crc kubenswrapper[4745]: I0319 00:39:56.980181 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vr7tg\" (UniqueName: \"kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg\") pod \"f5af6c85-4fb0-4045-b2cb-f96258977b97\" (UID: \"f5af6c85-4fb0-4045-b2cb-f96258977b97\") " Mar 19 00:39:56 crc kubenswrapper[4745]: I0319 00:39:56.987675 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg" (OuterVolumeSpecName: "kube-api-access-vr7tg") pod "f5af6c85-4fb0-4045-b2cb-f96258977b97" (UID: "f5af6c85-4fb0-4045-b2cb-f96258977b97"). InnerVolumeSpecName "kube-api-access-vr7tg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.081821 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vr7tg\" (UniqueName: \"kubernetes.io/projected/f5af6c85-4fb0-4045-b2cb-f96258977b97-kube-api-access-vr7tg\") on node \"crc\" DevicePath \"\"" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.482289 4745 generic.go:334] "Generic (PLEG): container finished" podID="f5af6c85-4fb0-4045-b2cb-f96258977b97" containerID="c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45" exitCode=0 Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.482349 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-629gf" event={"ID":"f5af6c85-4fb0-4045-b2cb-f96258977b97","Type":"ContainerDied","Data":"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45"} Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.482389 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-629gf" 
event={"ID":"f5af6c85-4fb0-4045-b2cb-f96258977b97","Type":"ContainerDied","Data":"d38bfb040a14025f30b9650d668f1ed36e039a8469a10993a3a97e36ef4d5129"} Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.482417 4745 scope.go:117] "RemoveContainer" containerID="c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.482626 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-629gf" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.513785 4745 scope.go:117] "RemoveContainer" containerID="c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45" Mar 19 00:39:57 crc kubenswrapper[4745]: E0319 00:39:57.514437 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45\": container with ID starting with c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45 not found: ID does not exist" containerID="c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.514505 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45"} err="failed to get container status \"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45\": rpc error: code = NotFound desc = could not find container \"c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45\": container with ID starting with c193a3408f7a5d54b89d15d5781f17401b19c84a58776217acfb835d2ff1cd45 not found: ID does not exist" Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.517806 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-629gf"] Mar 19 00:39:57 crc kubenswrapper[4745]: I0319 00:39:57.527467 4745 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-629gf"] Mar 19 00:39:58 crc kubenswrapper[4745]: I0319 00:39:58.147189 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5af6c85-4fb0-4045-b2cb-f96258977b97" path="/var/lib/kubelet/pods/f5af6c85-4fb0-4045-b2cb-f96258977b97/volumes" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.147731 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564680-w752m"] Mar 19 00:40:00 crc kubenswrapper[4745]: E0319 00:40:00.148529 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5af6c85-4fb0-4045-b2cb-f96258977b97" containerName="registry-server" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.148552 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5af6c85-4fb0-4045-b2cb-f96258977b97" containerName="registry-server" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.148706 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5af6c85-4fb0-4045-b2cb-f96258977b97" containerName="registry-server" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.149296 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.155898 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.156114 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564680-w752m"] Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.156187 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.156389 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.235625 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-765v7\" (UniqueName: \"kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7\") pod \"auto-csr-approver-29564680-w752m\" (UID: \"2afbbf4f-151b-4d25-9658-58353102abde\") " pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.338012 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-765v7\" (UniqueName: \"kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7\") pod \"auto-csr-approver-29564680-w752m\" (UID: \"2afbbf4f-151b-4d25-9658-58353102abde\") " pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.371540 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-765v7\" (UniqueName: \"kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7\") pod \"auto-csr-approver-29564680-w752m\" (UID: \"2afbbf4f-151b-4d25-9658-58353102abde\") " 
pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.468013 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:00 crc kubenswrapper[4745]: I0319 00:40:00.691216 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564680-w752m"] Mar 19 00:40:01 crc kubenswrapper[4745]: I0319 00:40:01.516647 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564680-w752m" event={"ID":"2afbbf4f-151b-4d25-9658-58353102abde","Type":"ContainerStarted","Data":"86721070d9a01d8c5d6051733892503901e00f811a580bb234089aa495d0de9f"} Mar 19 00:40:02 crc kubenswrapper[4745]: I0319 00:40:02.528831 4745 generic.go:334] "Generic (PLEG): container finished" podID="2afbbf4f-151b-4d25-9658-58353102abde" containerID="b11ec2f181d1b8a78640375e23f561adc4550277264f46fc103ead95a4d312d9" exitCode=0 Mar 19 00:40:02 crc kubenswrapper[4745]: I0319 00:40:02.528910 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564680-w752m" event={"ID":"2afbbf4f-151b-4d25-9658-58353102abde","Type":"ContainerDied","Data":"b11ec2f181d1b8a78640375e23f561adc4550277264f46fc103ead95a4d312d9"} Mar 19 00:40:03 crc kubenswrapper[4745]: I0319 00:40:03.790508 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:03 crc kubenswrapper[4745]: I0319 00:40:03.897238 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-765v7\" (UniqueName: \"kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7\") pod \"2afbbf4f-151b-4d25-9658-58353102abde\" (UID: \"2afbbf4f-151b-4d25-9658-58353102abde\") " Mar 19 00:40:03 crc kubenswrapper[4745]: I0319 00:40:03.905458 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7" (OuterVolumeSpecName: "kube-api-access-765v7") pod "2afbbf4f-151b-4d25-9658-58353102abde" (UID: "2afbbf4f-151b-4d25-9658-58353102abde"). InnerVolumeSpecName "kube-api-access-765v7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:40:03 crc kubenswrapper[4745]: I0319 00:40:03.999628 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-765v7\" (UniqueName: \"kubernetes.io/projected/2afbbf4f-151b-4d25-9658-58353102abde-kube-api-access-765v7\") on node \"crc\" DevicePath \"\"" Mar 19 00:40:04 crc kubenswrapper[4745]: I0319 00:40:04.549473 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564680-w752m" event={"ID":"2afbbf4f-151b-4d25-9658-58353102abde","Type":"ContainerDied","Data":"86721070d9a01d8c5d6051733892503901e00f811a580bb234089aa495d0de9f"} Mar 19 00:40:04 crc kubenswrapper[4745]: I0319 00:40:04.549920 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86721070d9a01d8c5d6051733892503901e00f811a580bb234089aa495d0de9f" Mar 19 00:40:04 crc kubenswrapper[4745]: I0319 00:40:04.549512 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564680-w752m" Mar 19 00:40:04 crc kubenswrapper[4745]: I0319 00:40:04.866581 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564674-2zhcp"] Mar 19 00:40:04 crc kubenswrapper[4745]: I0319 00:40:04.873219 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564674-2zhcp"] Mar 19 00:40:06 crc kubenswrapper[4745]: I0319 00:40:06.147599 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9fad81f-d73f-4e01-9a07-66b20741533e" path="/var/lib/kubelet/pods/c9fad81f-d73f-4e01-9a07-66b20741533e/volumes" Mar 19 00:40:08 crc kubenswrapper[4745]: I0319 00:40:08.612153 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-fztjk_cb0a157b-0f6d-4738-ae67-e29407c2ba8e/control-plane-machine-set-operator/0.log" Mar 19 00:40:08 crc kubenswrapper[4745]: I0319 00:40:08.774288 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hg72d_660e3fac-6534-49e0-a81e-38971c9fec3f/kube-rbac-proxy/0.log" Mar 19 00:40:08 crc kubenswrapper[4745]: I0319 00:40:08.818725 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-hg72d_660e3fac-6534-49e0-a81e-38971c9fec3f/machine-api-operator/0.log" Mar 19 00:40:20 crc kubenswrapper[4745]: I0319 00:40:20.680340 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-545d4d4674-ptrd5_93f48ad8-0863-4d90-abac-b887096b386c/cert-manager-controller/0.log" Mar 19 00:40:20 crc kubenswrapper[4745]: I0319 00:40:20.807351 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-5545bd876-cpzpl_bbe8b718-863a-404e-9be9-e872318f1ac0/cert-manager-cainjector/0.log" Mar 19 00:40:20 crc kubenswrapper[4745]: I0319 
00:40:20.860849 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-6888856db4-vr5md_9f7ceaac-a9f7-467b-83c9-298813ff6323/cert-manager-webhook/0.log" Mar 19 00:40:34 crc kubenswrapper[4745]: I0319 00:40:34.761458 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-tz2rm_bcf530c8-afe8-4a0e-9e5c-bfd85712e37a/prometheus-operator/0.log" Mar 19 00:40:34 crc kubenswrapper[4745]: I0319 00:40:34.881005 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2_f5c4fe84-51cb-479a-a8cc-2e07bde21417/prometheus-operator-admission-webhook/0.log" Mar 19 00:40:34 crc kubenswrapper[4745]: I0319 00:40:34.957158 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78_69d14850-5c50-4c06-8581-2a70644c7de7/prometheus-operator-admission-webhook/0.log" Mar 19 00:40:35 crc kubenswrapper[4745]: I0319 00:40:35.085897 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-shlz7_a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534/operator/0.log" Mar 19 00:40:35 crc kubenswrapper[4745]: I0319 00:40:35.184583 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-78548ff687-rjvkn_58d05d23-3632-4b84-94f8-1db548b90a03/perses-operator/0.log" Mar 19 00:40:36 crc kubenswrapper[4745]: I0319 00:40:36.170032 4745 scope.go:117] "RemoveContainer" containerID="6316938ec69e0de5c06031d730ac5d55047c4061f5dea80ee35cf89954f73a68" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.325214 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/util/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 
00:40:49.513848 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/pull/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.529013 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/util/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.546574 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/pull/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.734131 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/util/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.735014 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/pull/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.768750 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fqszl2_9ef9c16b-de5e-456d-899c-15bdcfba6c89/extract/0.log" Mar 19 00:40:49 crc kubenswrapper[4745]: I0319 00:40:49.932354 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/util/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.102081 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/pull/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.132364 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/util/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.137093 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/pull/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.393277 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/util/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.403110 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/pull/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.429895 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_7acef1e4a10e04db4e216682ff91f6a23804f55f83b8dd8f8f8f5ac39egs892_8baea7f9-4007-48cc-a849-7b8ce10c526b/extract/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.555943 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/util/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.722694 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/util/0.log" Mar 19 
00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.745566 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/pull/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.782008 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/pull/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.928293 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/util/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.956160 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/extract/0.log" Mar 19 00:40:50 crc kubenswrapper[4745]: I0319 00:40:50.959642 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5dw72l_4ef969f2-d76b-405e-baaf-c10a36d36ed3/pull/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.090648 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/util/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.269113 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/pull/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.286246 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/pull/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.289741 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/util/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.471194 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/extract/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.480400 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/pull/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.497740 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf92397264g27d_6e2c290f-c398-4c6e-9dec-82038e0bda08/util/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.657598 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-utilities/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.826955 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-utilities/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.846595 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-content/0.log" Mar 19 00:40:51 crc kubenswrapper[4745]: I0319 00:40:51.894158 4745 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-content/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.067163 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-utilities/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.075978 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/extract-content/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.438191 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-s8x87_7ce3eedc-9f09-4a12-8f4b-b7dd6af3034b/registry-server/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.473664 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-utilities/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.654574 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-content/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.688847 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-utilities/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.714029 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-content/0.log" Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.835736 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-utilities/0.log"
Mar 19 00:40:52 crc kubenswrapper[4745]: I0319 00:40:52.872244 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/extract-content/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.084167 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-hcgn8_9dc2e0fe-a8bd-4a4f-9ee2-1685bc395d06/marketplace-operator/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.249321 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-utilities/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.337129 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-pq2gm_0ff66d22-2b4c-4f11-acfe-06173ee9a07e/registry-server/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.465972 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-content/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.505345 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-content/0.log"
Mar 19 00:40:53 crc kubenswrapper[4745]: I0319 00:40:53.517823 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-utilities/0.log"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.300733 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-utilities/0.log"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.319970 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/extract-content/0.log"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.714904 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-k2n7t"]
Mar 19 00:40:54 crc kubenswrapper[4745]: E0319 00:40:54.715358 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2afbbf4f-151b-4d25-9658-58353102abde" containerName="oc"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.715380 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2afbbf4f-151b-4d25-9658-58353102abde" containerName="oc"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.715575 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2afbbf4f-151b-4d25-9658-58353102abde" containerName="oc"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.721225 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.738607 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2n7t"]
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.854337 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.854462 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpb4x\" (UniqueName: \"kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.854500 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.956370 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.956451 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpb4x\" (UniqueName: \"kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.956483 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.957103 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.957934 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:54 crc kubenswrapper[4745]: I0319 00:40:54.986491 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpb4x\" (UniqueName: \"kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x\") pod \"community-operators-k2n7t\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") " pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:55 crc kubenswrapper[4745]: I0319 00:40:55.375940 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:40:55 crc kubenswrapper[4745]: I0319 00:40:55.401975 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-6ddfl_e5ffb13a-ff99-429d-bfdc-cdd2a43b90c5/registry-server/0.log"
Mar 19 00:40:55 crc kubenswrapper[4745]: I0319 00:40:55.978373 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-k2n7t"]
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.485434 4745 generic.go:334] "Generic (PLEG): container finished" podID="2941df91-78ca-4017-94ec-60f34ac379a1" containerID="07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8" exitCode=0
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.485510 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerDied","Data":"07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8"}
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.485829 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerStarted","Data":"66be26ed0cc5fc431474d754705969bca8f867ddcc5eb902f18bf1d6f9dc4737"}
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.553369 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"]
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.555106 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.571990 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"]
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.697838 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7sf4\" (UniqueName: \"kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.697917 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.697953 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.799416 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7sf4\" (UniqueName: \"kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.799979 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.800038 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.800740 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.801222 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.823369 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7sf4\" (UniqueName: \"kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4\") pod \"certified-operators-7tnv7\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") " pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:56 crc kubenswrapper[4745]: I0319 00:40:56.877999 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:40:58 crc kubenswrapper[4745]: I0319 00:40:58.592140 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"]
Mar 19 00:40:58 crc kubenswrapper[4745]: W0319 00:40:58.602440 4745 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8674bfe_de0e_4acb_a1c3_e8a3bdca029c.slice/crio-180e4f33d148eb18a584113ea9b1a9b44e153f30ce8ff2099f5400b22fe52506 WatchSource:0}: Error finding container 180e4f33d148eb18a584113ea9b1a9b44e153f30ce8ff2099f5400b22fe52506: Status 404 returned error can't find the container with id 180e4f33d148eb18a584113ea9b1a9b44e153f30ce8ff2099f5400b22fe52506
Mar 19 00:40:58 crc kubenswrapper[4745]: I0319 00:40:58.907777 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerStarted","Data":"180e4f33d148eb18a584113ea9b1a9b44e153f30ce8ff2099f5400b22fe52506"}
Mar 19 00:41:00 crc kubenswrapper[4745]: I0319 00:41:00.209590 4745 generic.go:334] "Generic (PLEG): container finished" podID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerID="7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139" exitCode=0
Mar 19 00:41:00 crc kubenswrapper[4745]: I0319 00:41:00.209788 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerDied","Data":"7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139"}
Mar 19 00:41:01 crc kubenswrapper[4745]: I0319 00:41:01.220868 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerStarted","Data":"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"}
Mar 19 00:41:01 crc kubenswrapper[4745]: I0319 00:41:01.223649 4745 generic.go:334] "Generic (PLEG): container finished" podID="2941df91-78ca-4017-94ec-60f34ac379a1" containerID="7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126" exitCode=0
Mar 19 00:41:01 crc kubenswrapper[4745]: I0319 00:41:01.223723 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerDied","Data":"7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126"}
Mar 19 00:41:02 crc kubenswrapper[4745]: I0319 00:41:02.234264 4745 generic.go:334] "Generic (PLEG): container finished" podID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerID="00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1" exitCode=0
Mar 19 00:41:02 crc kubenswrapper[4745]: I0319 00:41:02.234367 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerDied","Data":"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"}
Mar 19 00:41:02 crc kubenswrapper[4745]: I0319 00:41:02.238092 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerStarted","Data":"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"}
Mar 19 00:41:02 crc kubenswrapper[4745]: I0319 00:41:02.291418 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-k2n7t" podStartSLOduration=2.944246911 podStartE2EDuration="8.291390834s" podCreationTimestamp="2026-03-19 00:40:54 +0000 UTC" firstStartedPulling="2026-03-19 00:40:56.488517523 +0000 UTC m=+2021.026712654" lastFinishedPulling="2026-03-19 00:41:01.835661446 +0000 UTC m=+2026.373856577" observedRunningTime="2026-03-19 00:41:02.284482404 +0000 UTC m=+2026.822677555" watchObservedRunningTime="2026-03-19 00:41:02.291390834 +0000 UTC m=+2026.829585965"
Mar 19 00:41:03 crc kubenswrapper[4745]: I0319 00:41:03.250398 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerStarted","Data":"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"}
Mar 19 00:41:03 crc kubenswrapper[4745]: I0319 00:41:03.276487 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7tnv7" podStartSLOduration=4.842246496 podStartE2EDuration="7.276463797s" podCreationTimestamp="2026-03-19 00:40:56 +0000 UTC" firstStartedPulling="2026-03-19 00:41:00.213564675 +0000 UTC m=+2024.751759806" lastFinishedPulling="2026-03-19 00:41:02.647781976 +0000 UTC m=+2027.185977107" observedRunningTime="2026-03-19 00:41:03.275359262 +0000 UTC m=+2027.813554403" watchObservedRunningTime="2026-03-19 00:41:03.276463797 +0000 UTC m=+2027.814658928"
Mar 19 00:41:05 crc kubenswrapper[4745]: I0319 00:41:05.377408 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:05 crc kubenswrapper[4745]: I0319 00:41:05.377921 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:05 crc kubenswrapper[4745]: I0319 00:41:05.433802 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:06 crc kubenswrapper[4745]: I0319 00:41:06.879299 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:06 crc kubenswrapper[4745]: I0319 00:41:06.880651 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:06 crc kubenswrapper[4745]: I0319 00:41:06.921927 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:07 crc kubenswrapper[4745]: I0319 00:41:07.327826 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:07 crc kubenswrapper[4745]: I0319 00:41:07.328398 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:07 crc kubenswrapper[4745]: I0319 00:41:07.946526 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"]
Mar 19 00:41:09 crc kubenswrapper[4745]: I0319 00:41:09.294678 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7tnv7" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="registry-server" containerID="cri-o://b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf" gracePeriod=2
Mar 19 00:41:09 crc kubenswrapper[4745]: I0319 00:41:09.745034 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k2n7t"]
Mar 19 00:41:09 crc kubenswrapper[4745]: I0319 00:41:09.745310 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-k2n7t" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="registry-server" containerID="cri-o://768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478" gracePeriod=2
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.273869 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.279164 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.307317 4745 generic.go:334] "Generic (PLEG): container finished" podID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerID="b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf" exitCode=0
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.308246 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7tnv7"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.308678 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerDied","Data":"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"}
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.308721 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7tnv7" event={"ID":"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c","Type":"ContainerDied","Data":"180e4f33d148eb18a584113ea9b1a9b44e153f30ce8ff2099f5400b22fe52506"}
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.308740 4745 scope.go:117] "RemoveContainer" containerID="b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.320509 4745 generic.go:334] "Generic (PLEG): container finished" podID="2941df91-78ca-4017-94ec-60f34ac379a1" containerID="768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478" exitCode=0
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.320586 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerDied","Data":"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"}
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.320643 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-k2n7t" event={"ID":"2941df91-78ca-4017-94ec-60f34ac379a1","Type":"ContainerDied","Data":"66be26ed0cc5fc431474d754705969bca8f867ddcc5eb902f18bf1d6f9dc4737"}
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.320753 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-k2n7t"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.333686 4745 scope.go:117] "RemoveContainer" containerID="00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.341768 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpb4x\" (UniqueName: \"kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x\") pod \"2941df91-78ca-4017-94ec-60f34ac379a1\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.341909 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content\") pod \"2941df91-78ca-4017-94ec-60f34ac379a1\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.341975 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities\") pod \"2941df91-78ca-4017-94ec-60f34ac379a1\" (UID: \"2941df91-78ca-4017-94ec-60f34ac379a1\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.342034 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7sf4\" (UniqueName: \"kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4\") pod \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.342085 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities\") pod \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.342125 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content\") pod \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\" (UID: \"d8674bfe-de0e-4acb-a1c3-e8a3bdca029c\") "
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.345455 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities" (OuterVolumeSpecName: "utilities") pod "2941df91-78ca-4017-94ec-60f34ac379a1" (UID: "2941df91-78ca-4017-94ec-60f34ac379a1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.350084 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities" (OuterVolumeSpecName: "utilities") pod "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" (UID: "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.372177 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4" (OuterVolumeSpecName: "kube-api-access-g7sf4") pod "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" (UID: "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c"). InnerVolumeSpecName "kube-api-access-g7sf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.372266 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x" (OuterVolumeSpecName: "kube-api-access-qpb4x") pod "2941df91-78ca-4017-94ec-60f34ac379a1" (UID: "2941df91-78ca-4017-94ec-60f34ac379a1"). InnerVolumeSpecName "kube-api-access-qpb4x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.407245 4745 scope.go:117] "RemoveContainer" containerID="7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.444817 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.444855 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7sf4\" (UniqueName: \"kubernetes.io/projected/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-kube-api-access-g7sf4\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.444865 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.444873 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpb4x\" (UniqueName: \"kubernetes.io/projected/2941df91-78ca-4017-94ec-60f34ac379a1-kube-api-access-qpb4x\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.477383 4745 scope.go:117] "RemoveContainer" containerID="b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.478143 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf\": container with ID starting with b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf not found: ID does not exist" containerID="b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.478175 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf"} err="failed to get container status \"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf\": rpc error: code = NotFound desc = could not find container \"b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf\": container with ID starting with b109e2bb9ea44c937d074e799503411fc9d99a58765884435aa7553a98fa0fbf not found: ID does not exist"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.478210 4745 scope.go:117] "RemoveContainer" containerID="00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.478730 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1\": container with ID starting with 00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1 not found: ID does not exist" containerID="00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.478753 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1"} err="failed to get container status \"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1\": rpc error: code = NotFound desc = could not find container \"00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1\": container with ID starting with 00efd0b99cf8698832d9ecb4b731a7723cc479df348f386eb3dab08dfd9345a1 not found: ID does not exist"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.478769 4745 scope.go:117] "RemoveContainer" containerID="7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.479380 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139\": container with ID starting with 7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139 not found: ID does not exist" containerID="7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.479445 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139"} err="failed to get container status \"7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139\": rpc error: code = NotFound desc = could not find container \"7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139\": container with ID starting with 7270be1569e44f384f59c6e573143e80d94f19206918759b3263e976f8d2c139 not found: ID does not exist"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.479478 4745 scope.go:117] "RemoveContainer" containerID="768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.495133 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" (UID: "d8674bfe-de0e-4acb-a1c3-e8a3bdca029c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.504603 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2941df91-78ca-4017-94ec-60f34ac379a1" (UID: "2941df91-78ca-4017-94ec-60f34ac379a1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.506677 4745 scope.go:117] "RemoveContainer" containerID="7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.529186 4745 scope.go:117] "RemoveContainer" containerID="07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.546843 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2941df91-78ca-4017-94ec-60f34ac379a1-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.546900 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.555938 4745 scope.go:117] "RemoveContainer" containerID="768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.556635 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478\": container with ID starting with 768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478 not found: ID does not exist" containerID="768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.556686 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478"} err="failed to get container status \"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478\": rpc error: code = NotFound desc = could not find container \"768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478\": container with ID starting with 768439f9e473cc4fee950e811dc923ee7f4b26bd24add626d7de668708eba478 not found: ID does not exist"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.556817 4745 scope.go:117] "RemoveContainer" containerID="7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.557238 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126\": container with ID starting with 7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126 not found: ID does not exist" containerID="7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.557291 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126"} err="failed to get container status \"7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126\": rpc error: code = NotFound desc = could not find container \"7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126\": container with ID starting with 7efac035a668b5ae7a1109d190116c7c2c59b1641149f723e1694d05d3f1c126 not found: ID does not exist"
Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.557335 4745 scope.go:117] "RemoveContainer" containerID="07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8"
Mar 19 00:41:10 crc kubenswrapper[4745]: E0319 00:41:10.557751 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8\": container with ID starting with 07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8 not found: ID does not exist"
containerID="07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8" Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.557823 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8"} err="failed to get container status \"07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8\": rpc error: code = NotFound desc = could not find container \"07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8\": container with ID starting with 07be6baeabf3f0ff3b5dc094e2323b2fceb0c8de6258a3ad7f2a506b37f3caf8 not found: ID does not exist" Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.642782 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"] Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.653978 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7tnv7"] Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.662562 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-k2n7t"] Mar 19 00:41:10 crc kubenswrapper[4745]: I0319 00:41:10.671415 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-k2n7t"] Mar 19 00:41:11 crc kubenswrapper[4745]: I0319 00:41:11.243092 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-tz2rm_bcf530c8-afe8-4a0e-9e5c-bfd85712e37a/prometheus-operator/0.log" Mar 19 00:41:11 crc kubenswrapper[4745]: I0319 00:41:11.295962 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d8dc485d-d95n2_f5c4fe84-51cb-479a-a8cc-2e07bde21417/prometheus-operator-admission-webhook/0.log" Mar 19 00:41:11 crc kubenswrapper[4745]: I0319 00:41:11.348813 4745 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57d8dc485d-qxd78_69d14850-5c50-4c06-8581-2a70644c7de7/prometheus-operator-admission-webhook/0.log" Mar 19 00:41:11 crc kubenswrapper[4745]: I0319 00:41:11.431598 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-shlz7_a7c2afa4-a0d4-4aad-b6ad-31b7cb4c9534/operator/0.log" Mar 19 00:41:11 crc kubenswrapper[4745]: I0319 00:41:11.530008 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-78548ff687-rjvkn_58d05d23-3632-4b84-94f8-1db548b90a03/perses-operator/0.log" Mar 19 00:41:12 crc kubenswrapper[4745]: I0319 00:41:12.151841 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" path="/var/lib/kubelet/pods/2941df91-78ca-4017-94ec-60f34ac379a1/volumes" Mar 19 00:41:12 crc kubenswrapper[4745]: I0319 00:41:12.152695 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" path="/var/lib/kubelet/pods/d8674bfe-de0e-4acb-a1c3-e8a3bdca029c/volumes" Mar 19 00:41:15 crc kubenswrapper[4745]: I0319 00:41:15.606243 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:41:15 crc kubenswrapper[4745]: I0319 00:41:15.608039 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:41:45 crc kubenswrapper[4745]: I0319 00:41:45.606364 4745 
patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:41:45 crc kubenswrapper[4745]: I0319 00:41:45.606966 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.153149 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564682-qhdlb"] Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154197 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="extract-utilities" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154210 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="extract-utilities" Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154227 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154234 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154247 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="extract-content" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154255 4745 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="extract-content" Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154271 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="extract-content" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154276 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="extract-content" Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154288 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="extract-utilities" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154294 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="extract-utilities" Mar 19 00:42:00 crc kubenswrapper[4745]: E0319 00:42:00.154310 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154317 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154452 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="2941df91-78ca-4017-94ec-60f34ac379a1" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.154465 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8674bfe-de0e-4acb-a1c3-e8a3bdca029c" containerName="registry-server" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.155179 4745 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.158632 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.158725 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.159610 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.165016 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564682-qhdlb"] Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.326901 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5w9r\" (UniqueName: \"kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r\") pod \"auto-csr-approver-29564682-qhdlb\" (UID: \"cb19dc97-c11c-4239-9119-66b62533468d\") " pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.430351 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5w9r\" (UniqueName: \"kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r\") pod \"auto-csr-approver-29564682-qhdlb\" (UID: \"cb19dc97-c11c-4239-9119-66b62533468d\") " pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.457390 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5w9r\" (UniqueName: \"kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r\") pod \"auto-csr-approver-29564682-qhdlb\" (UID: \"cb19dc97-c11c-4239-9119-66b62533468d\") " 
pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.483248 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.711206 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564682-qhdlb"] Mar 19 00:42:00 crc kubenswrapper[4745]: I0319 00:42:00.768236 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" event={"ID":"cb19dc97-c11c-4239-9119-66b62533468d","Type":"ContainerStarted","Data":"9eb418b15c5055fa4b6c8739ea732b403f78c464de0844e12bbc3d3cdaaaaafb"} Mar 19 00:42:02 crc kubenswrapper[4745]: I0319 00:42:02.799085 4745 generic.go:334] "Generic (PLEG): container finished" podID="cb19dc97-c11c-4239-9119-66b62533468d" containerID="989ddcb40e12489a6193192610525ba17add6f9232e7834040bccd51c493446e" exitCode=0 Mar 19 00:42:02 crc kubenswrapper[4745]: I0319 00:42:02.799510 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" event={"ID":"cb19dc97-c11c-4239-9119-66b62533468d","Type":"ContainerDied","Data":"989ddcb40e12489a6193192610525ba17add6f9232e7834040bccd51c493446e"} Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.065614 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.210920 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5w9r\" (UniqueName: \"kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r\") pod \"cb19dc97-c11c-4239-9119-66b62533468d\" (UID: \"cb19dc97-c11c-4239-9119-66b62533468d\") " Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.231989 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r" (OuterVolumeSpecName: "kube-api-access-q5w9r") pod "cb19dc97-c11c-4239-9119-66b62533468d" (UID: "cb19dc97-c11c-4239-9119-66b62533468d"). InnerVolumeSpecName "kube-api-access-q5w9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.312869 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5w9r\" (UniqueName: \"kubernetes.io/projected/cb19dc97-c11c-4239-9119-66b62533468d-kube-api-access-q5w9r\") on node \"crc\" DevicePath \"\"" Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.827093 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" event={"ID":"cb19dc97-c11c-4239-9119-66b62533468d","Type":"ContainerDied","Data":"9eb418b15c5055fa4b6c8739ea732b403f78c464de0844e12bbc3d3cdaaaaafb"} Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.827450 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb418b15c5055fa4b6c8739ea732b403f78c464de0844e12bbc3d3cdaaaaafb" Mar 19 00:42:04 crc kubenswrapper[4745]: I0319 00:42:04.827360 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564682-qhdlb" Mar 19 00:42:05 crc kubenswrapper[4745]: I0319 00:42:05.144497 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564676-6jsmm"] Mar 19 00:42:05 crc kubenswrapper[4745]: I0319 00:42:05.150930 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564676-6jsmm"] Mar 19 00:42:06 crc kubenswrapper[4745]: I0319 00:42:06.148133 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b28c36f9-bd14-4de4-a0b3-e3f5e9131f60" path="/var/lib/kubelet/pods/b28c36f9-bd14-4de4-a0b3-e3f5e9131f60/volumes" Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.606440 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.607065 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.607122 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.607674 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" 
containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.607732 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01" gracePeriod=600 Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.915317 4745 generic.go:334] "Generic (PLEG): container finished" podID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerID="e08cee8d6016575c3039d3145b9b52e25c4da1ffbc14b55a1a247732682d9c5a" exitCode=0 Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.915414 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-45wdk/must-gather-j98n7" event={"ID":"08615e98-17b3-40c9-8b9b-e372a9ca1b04","Type":"ContainerDied","Data":"e08cee8d6016575c3039d3145b9b52e25c4da1ffbc14b55a1a247732682d9c5a"} Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.916307 4745 scope.go:117] "RemoveContainer" containerID="e08cee8d6016575c3039d3145b9b52e25c4da1ffbc14b55a1a247732682d9c5a" Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.922176 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01" exitCode=0 Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.922229 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01"} Mar 19 00:42:15 crc kubenswrapper[4745]: I0319 00:42:15.922279 4745 scope.go:117] "RemoveContainer" 
containerID="14a9d15c27ad4c171feb9ac1bfbb0e7f93a835f093cc541e0e466cce492d65b2" Mar 19 00:42:16 crc kubenswrapper[4745]: I0319 00:42:16.318340 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-45wdk_must-gather-j98n7_08615e98-17b3-40c9-8b9b-e372a9ca1b04/gather/0.log" Mar 19 00:42:16 crc kubenswrapper[4745]: I0319 00:42:16.932559 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerStarted","Data":"26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa"} Mar 19 00:42:23 crc kubenswrapper[4745]: I0319 00:42:23.639094 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-45wdk/must-gather-j98n7"] Mar 19 00:42:23 crc kubenswrapper[4745]: I0319 00:42:23.640436 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-45wdk/must-gather-j98n7" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="copy" containerID="cri-o://9910ba3cd8a666005f7972e55a88ec614e51f3eed198ea44dfa464904fab77e2" gracePeriod=2 Mar 19 00:42:23 crc kubenswrapper[4745]: I0319 00:42:23.646630 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-45wdk/must-gather-j98n7"] Mar 19 00:42:23 crc kubenswrapper[4745]: I0319 00:42:23.999105 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-45wdk_must-gather-j98n7_08615e98-17b3-40c9-8b9b-e372a9ca1b04/copy/0.log" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.000086 4745 generic.go:334] "Generic (PLEG): container finished" podID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerID="9910ba3cd8a666005f7972e55a88ec614e51f3eed198ea44dfa464904fab77e2" exitCode=143 Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.092344 4745 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-45wdk_must-gather-j98n7_08615e98-17b3-40c9-8b9b-e372a9ca1b04/copy/0.log" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.093120 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.134093 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output\") pod \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.134220 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ncd7b\" (UniqueName: \"kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b\") pod \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\" (UID: \"08615e98-17b3-40c9-8b9b-e372a9ca1b04\") " Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.143290 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b" (OuterVolumeSpecName: "kube-api-access-ncd7b") pod "08615e98-17b3-40c9-8b9b-e372a9ca1b04" (UID: "08615e98-17b3-40c9-8b9b-e372a9ca1b04"). InnerVolumeSpecName "kube-api-access-ncd7b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.209897 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "08615e98-17b3-40c9-8b9b-e372a9ca1b04" (UID: "08615e98-17b3-40c9-8b9b-e372a9ca1b04"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.236139 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ncd7b\" (UniqueName: \"kubernetes.io/projected/08615e98-17b3-40c9-8b9b-e372a9ca1b04-kube-api-access-ncd7b\") on node \"crc\" DevicePath \"\"" Mar 19 00:42:24 crc kubenswrapper[4745]: I0319 00:42:24.236212 4745 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/08615e98-17b3-40c9-8b9b-e372a9ca1b04-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 19 00:42:25 crc kubenswrapper[4745]: I0319 00:42:25.011897 4745 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-45wdk_must-gather-j98n7_08615e98-17b3-40c9-8b9b-e372a9ca1b04/copy/0.log" Mar 19 00:42:25 crc kubenswrapper[4745]: I0319 00:42:25.013086 4745 scope.go:117] "RemoveContainer" containerID="9910ba3cd8a666005f7972e55a88ec614e51f3eed198ea44dfa464904fab77e2" Mar 19 00:42:25 crc kubenswrapper[4745]: I0319 00:42:25.013216 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-45wdk/must-gather-j98n7" Mar 19 00:42:25 crc kubenswrapper[4745]: I0319 00:42:25.046480 4745 scope.go:117] "RemoveContainer" containerID="e08cee8d6016575c3039d3145b9b52e25c4da1ffbc14b55a1a247732682d9c5a" Mar 19 00:42:26 crc kubenswrapper[4745]: I0319 00:42:26.147520 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" path="/var/lib/kubelet/pods/08615e98-17b3-40c9-8b9b-e372a9ca1b04/volumes" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.528022 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"] Mar 19 00:42:35 crc kubenswrapper[4745]: E0319 00:42:35.529289 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="copy" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529308 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="copy" Mar 19 00:42:35 crc kubenswrapper[4745]: E0319 00:42:35.529337 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="gather" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529346 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="gather" Mar 19 00:42:35 crc kubenswrapper[4745]: E0319 00:42:35.529370 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb19dc97-c11c-4239-9119-66b62533468d" containerName="oc" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529379 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb19dc97-c11c-4239-9119-66b62533468d" containerName="oc" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529519 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="copy" Mar 19 
00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529531 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb19dc97-c11c-4239-9119-66b62533468d" containerName="oc" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.529553 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="08615e98-17b3-40c9-8b9b-e372a9ca1b04" containerName="gather" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.530721 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.545080 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"] Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.630287 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhxqz\" (UniqueName: \"kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.630630 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.630792 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc 
kubenswrapper[4745]: I0319 00:42:35.732242 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.732323 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.732372 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhxqz\" (UniqueName: \"kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.732984 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.732984 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd" Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.756815 4745 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhxqz\" (UniqueName: \"kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz\") pod \"redhat-operators-pn5dd\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") " pod="openshift-marketplace/redhat-operators-pn5dd"
Mar 19 00:42:35 crc kubenswrapper[4745]: I0319 00:42:35.852175 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pn5dd"
Mar 19 00:42:36 crc kubenswrapper[4745]: I0319 00:42:36.168153 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"]
Mar 19 00:42:36 crc kubenswrapper[4745]: I0319 00:42:36.311456 4745 scope.go:117] "RemoveContainer" containerID="4de285e74e641eb21f2a1ee98f32c5f610d3c8d1a0fc10bb8a444c82e684e43e"
Mar 19 00:42:37 crc kubenswrapper[4745]: I0319 00:42:37.147962 4745 generic.go:334] "Generic (PLEG): container finished" podID="512efe3d-191c-49b8-bd41-897707ccc697" containerID="a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d" exitCode=0
Mar 19 00:42:37 crc kubenswrapper[4745]: I0319 00:42:37.148656 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerDied","Data":"a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d"}
Mar 19 00:42:37 crc kubenswrapper[4745]: I0319 00:42:37.148713 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerStarted","Data":"58c40ca0c8930bf8462ccde33f2505afcaa5f046f05f364b86a83cb6df27a1e3"}
Mar 19 00:42:39 crc kubenswrapper[4745]: I0319 00:42:39.177864 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerStarted","Data":"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719"}
Mar 19 00:42:40 crc kubenswrapper[4745]: I0319 00:42:40.187498 4745 generic.go:334] "Generic (PLEG): container finished" podID="512efe3d-191c-49b8-bd41-897707ccc697" containerID="5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719" exitCode=0
Mar 19 00:42:40 crc kubenswrapper[4745]: I0319 00:42:40.187569 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerDied","Data":"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719"}
Mar 19 00:42:41 crc kubenswrapper[4745]: I0319 00:42:41.200216 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerStarted","Data":"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b"}
Mar 19 00:42:41 crc kubenswrapper[4745]: I0319 00:42:41.225458 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-pn5dd" podStartSLOduration=2.762867799 podStartE2EDuration="6.225430783s" podCreationTimestamp="2026-03-19 00:42:35 +0000 UTC" firstStartedPulling="2026-03-19 00:42:37.154694911 +0000 UTC m=+2121.692890042" lastFinishedPulling="2026-03-19 00:42:40.617257895 +0000 UTC m=+2125.155453026" observedRunningTime="2026-03-19 00:42:41.218706328 +0000 UTC m=+2125.756901479" watchObservedRunningTime="2026-03-19 00:42:41.225430783 +0000 UTC m=+2125.763625914"
Mar 19 00:42:45 crc kubenswrapper[4745]: I0319 00:42:45.853179 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-pn5dd"
Mar 19 00:42:45 crc kubenswrapper[4745]: I0319 00:42:45.853965 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-pn5dd"
Mar 19 00:42:46 crc kubenswrapper[4745]: I0319 00:42:46.904837 4745 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-pn5dd" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="registry-server" probeResult="failure" output=<
Mar 19 00:42:46 crc kubenswrapper[4745]: timeout: failed to connect service ":50051" within 1s
Mar 19 00:42:46 crc kubenswrapper[4745]: >
Mar 19 00:42:55 crc kubenswrapper[4745]: I0319 00:42:55.913867 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-pn5dd"
Mar 19 00:42:55 crc kubenswrapper[4745]: I0319 00:42:55.963147 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-pn5dd"
Mar 19 00:42:56 crc kubenswrapper[4745]: I0319 00:42:56.180903 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"]
Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.331559 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-pn5dd" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="registry-server" containerID="cri-o://185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b" gracePeriod=2
Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.727450 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pn5dd"
Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.804530 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content\") pod \"512efe3d-191c-49b8-bd41-897707ccc697\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") "
Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.804625 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities\") pod \"512efe3d-191c-49b8-bd41-897707ccc697\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") "
Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.805910 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities" (OuterVolumeSpecName: "utilities") pod "512efe3d-191c-49b8-bd41-897707ccc697" (UID: "512efe3d-191c-49b8-bd41-897707ccc697"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.806197 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhxqz\" (UniqueName: \"kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz\") pod \"512efe3d-191c-49b8-bd41-897707ccc697\" (UID: \"512efe3d-191c-49b8-bd41-897707ccc697\") "
Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.807822 4745 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.811991 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz" (OuterVolumeSpecName: "kube-api-access-xhxqz") pod "512efe3d-191c-49b8-bd41-897707ccc697" (UID: "512efe3d-191c-49b8-bd41-897707ccc697"). InnerVolumeSpecName "kube-api-access-xhxqz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.908951 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhxqz\" (UniqueName: \"kubernetes.io/projected/512efe3d-191c-49b8-bd41-897707ccc697-kube-api-access-xhxqz\") on node \"crc\" DevicePath \"\""
Mar 19 00:42:57 crc kubenswrapper[4745]: I0319 00:42:57.949522 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "512efe3d-191c-49b8-bd41-897707ccc697" (UID: "512efe3d-191c-49b8-bd41-897707ccc697"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.010434 4745 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/512efe3d-191c-49b8-bd41-897707ccc697-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.343712 4745 generic.go:334] "Generic (PLEG): container finished" podID="512efe3d-191c-49b8-bd41-897707ccc697" containerID="185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b" exitCode=0
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.343804 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-pn5dd"
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.343830 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerDied","Data":"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b"}
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.344691 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-pn5dd" event={"ID":"512efe3d-191c-49b8-bd41-897707ccc697","Type":"ContainerDied","Data":"58c40ca0c8930bf8462ccde33f2505afcaa5f046f05f364b86a83cb6df27a1e3"}
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.344727 4745 scope.go:117] "RemoveContainer" containerID="185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b"
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.617776 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"]
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.618331 4745 scope.go:117] "RemoveContainer" containerID="5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719"
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.624612 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-pn5dd"]
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.643318 4745 scope.go:117] "RemoveContainer" containerID="a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d"
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.667553 4745 scope.go:117] "RemoveContainer" containerID="185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b"
Mar 19 00:42:58 crc kubenswrapper[4745]: E0319 00:42:58.668117 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b\": container with ID starting with 185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b not found: ID does not exist" containerID="185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b"
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.668155 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b"} err="failed to get container status \"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b\": rpc error: code = NotFound desc = could not find container \"185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b\": container with ID starting with 185dd56b2bd1ad1cc08cb4bb2c2f681c07d8f6570177609d3b8553bca0e4ec3b not found: ID does not exist"
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.668184 4745 scope.go:117] "RemoveContainer" containerID="5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719"
Mar 19 00:42:58 crc kubenswrapper[4745]: E0319 00:42:58.668448 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719\": container with ID starting with 5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719 not found: ID does not exist" containerID="5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719"
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.668667 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719"} err="failed to get container status \"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719\": rpc error: code = NotFound desc = could not find container \"5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719\": container with ID starting with 5a6e1bea36cde28720c8a97bee2f1b76952e797a666bd92f5d9c65527f904719 not found: ID does not exist"
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.668691 4745 scope.go:117] "RemoveContainer" containerID="a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d"
Mar 19 00:42:58 crc kubenswrapper[4745]: E0319 00:42:58.668921 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d\": container with ID starting with a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d not found: ID does not exist" containerID="a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d"
Mar 19 00:42:58 crc kubenswrapper[4745]: I0319 00:42:58.668947 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d"} err="failed to get container status \"a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d\": rpc error: code = NotFound desc = could not find container \"a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d\": container with ID starting with a03277c8e30ff87a7c4a8383017c5af022490b3f248e7078fddd77dc21177e4d not found: ID does not exist"
Mar 19 00:43:00 crc kubenswrapper[4745]: I0319 00:43:00.149601 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="512efe3d-191c-49b8-bd41-897707ccc697" path="/var/lib/kubelet/pods/512efe3d-191c-49b8-bd41-897707ccc697/volumes"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.152597 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564684-ltrd8"]
Mar 19 00:44:00 crc kubenswrapper[4745]: E0319 00:44:00.153738 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="extract-utilities"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.153761 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="extract-utilities"
Mar 19 00:44:00 crc kubenswrapper[4745]: E0319 00:44:00.153777 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="extract-content"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.153784 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="extract-content"
Mar 19 00:44:00 crc kubenswrapper[4745]: E0319 00:44:00.153807 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="registry-server"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.153813 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="registry-server"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.154047 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="512efe3d-191c-49b8-bd41-897707ccc697" containerName="registry-server"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.155616 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564684-ltrd8"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.159075 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.159094 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.159079 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.161298 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564684-ltrd8"]
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.257376 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qfcj\" (UniqueName: \"kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj\") pod \"auto-csr-approver-29564684-ltrd8\" (UID: \"be744c99-b821-4dc8-8e93-92afcd2ca04a\") " pod="openshift-infra/auto-csr-approver-29564684-ltrd8"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.359198 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qfcj\" (UniqueName: \"kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj\") pod \"auto-csr-approver-29564684-ltrd8\" (UID: \"be744c99-b821-4dc8-8e93-92afcd2ca04a\") " pod="openshift-infra/auto-csr-approver-29564684-ltrd8"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.381013 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qfcj\" (UniqueName: \"kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj\") pod \"auto-csr-approver-29564684-ltrd8\" (UID: \"be744c99-b821-4dc8-8e93-92afcd2ca04a\") " pod="openshift-infra/auto-csr-approver-29564684-ltrd8"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.480648 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564684-ltrd8"
Mar 19 00:44:00 crc kubenswrapper[4745]: I0319 00:44:00.945241 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564684-ltrd8"]
Mar 19 00:44:01 crc kubenswrapper[4745]: I0319 00:44:01.298047 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564684-ltrd8" event={"ID":"be744c99-b821-4dc8-8e93-92afcd2ca04a","Type":"ContainerStarted","Data":"ee71b4cb1ead71c102b39038fdf07cac65c3dd0a4c4d36b10b9ba5fc46461cd9"}
Mar 19 00:44:02 crc kubenswrapper[4745]: I0319 00:44:02.308960 4745 generic.go:334] "Generic (PLEG): container finished" podID="be744c99-b821-4dc8-8e93-92afcd2ca04a" containerID="95dd962052ffe49051aaddb3d90230a87d8c24e2d8b45a7954e0d61fef0111b0" exitCode=0
Mar 19 00:44:02 crc kubenswrapper[4745]: I0319 00:44:02.309105 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564684-ltrd8" event={"ID":"be744c99-b821-4dc8-8e93-92afcd2ca04a","Type":"ContainerDied","Data":"95dd962052ffe49051aaddb3d90230a87d8c24e2d8b45a7954e0d61fef0111b0"}
Mar 19 00:44:03 crc kubenswrapper[4745]: I0319 00:44:03.599795 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564684-ltrd8"
Mar 19 00:44:03 crc kubenswrapper[4745]: I0319 00:44:03.718752 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qfcj\" (UniqueName: \"kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj\") pod \"be744c99-b821-4dc8-8e93-92afcd2ca04a\" (UID: \"be744c99-b821-4dc8-8e93-92afcd2ca04a\") "
Mar 19 00:44:03 crc kubenswrapper[4745]: I0319 00:44:03.724297 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj" (OuterVolumeSpecName: "kube-api-access-6qfcj") pod "be744c99-b821-4dc8-8e93-92afcd2ca04a" (UID: "be744c99-b821-4dc8-8e93-92afcd2ca04a"). InnerVolumeSpecName "kube-api-access-6qfcj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:44:03 crc kubenswrapper[4745]: I0319 00:44:03.821225 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qfcj\" (UniqueName: \"kubernetes.io/projected/be744c99-b821-4dc8-8e93-92afcd2ca04a-kube-api-access-6qfcj\") on node \"crc\" DevicePath \"\""
Mar 19 00:44:04 crc kubenswrapper[4745]: I0319 00:44:04.325763 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564684-ltrd8" event={"ID":"be744c99-b821-4dc8-8e93-92afcd2ca04a","Type":"ContainerDied","Data":"ee71b4cb1ead71c102b39038fdf07cac65c3dd0a4c4d36b10b9ba5fc46461cd9"}
Mar 19 00:44:04 crc kubenswrapper[4745]: I0319 00:44:04.325817 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ee71b4cb1ead71c102b39038fdf07cac65c3dd0a4c4d36b10b9ba5fc46461cd9"
Mar 19 00:44:04 crc kubenswrapper[4745]: I0319 00:44:04.325819 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564684-ltrd8"
Mar 19 00:44:04 crc kubenswrapper[4745]: I0319 00:44:04.679309 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564678-m5hsl"]
Mar 19 00:44:04 crc kubenswrapper[4745]: I0319 00:44:04.686201 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564678-m5hsl"]
Mar 19 00:44:06 crc kubenswrapper[4745]: I0319 00:44:06.148521 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3ff29f0-5d28-4572-bd21-aac2f86091a8" path="/var/lib/kubelet/pods/c3ff29f0-5d28-4572-bd21-aac2f86091a8/volumes"
Mar 19 00:44:15 crc kubenswrapper[4745]: I0319 00:44:15.606480 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 00:44:15 crc kubenswrapper[4745]: I0319 00:44:15.607270 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 00:44:36 crc kubenswrapper[4745]: I0319 00:44:36.446996 4745 scope.go:117] "RemoveContainer" containerID="97740a413cb8dbbe5a9163cc3b0f901c899139f75274026549c0e4cc3732d413"
Mar 19 00:44:45 crc kubenswrapper[4745]: I0319 00:44:45.606459 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 00:44:45 crc kubenswrapper[4745]: I0319 00:44:45.607204 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.160977 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"]
Mar 19 00:45:00 crc kubenswrapper[4745]: E0319 00:45:00.162585 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be744c99-b821-4dc8-8e93-92afcd2ca04a" containerName="oc"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.162609 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="be744c99-b821-4dc8-8e93-92afcd2ca04a" containerName="oc"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.162906 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="be744c99-b821-4dc8-8e93-92afcd2ca04a" containerName="oc"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.163627 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.166598 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.167358 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.172010 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"]
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.334352 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01776a81-2c53-4be6-976d-8eb715ee32ca-config-volume\") pod \"collect-profiles-29564685-d862w\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.334458 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01776a81-2c53-4be6-976d-8eb715ee32ca-secret-volume\") pod \"collect-profiles-29564685-d862w\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.334480 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45njq\" (UniqueName: \"kubernetes.io/projected/01776a81-2c53-4be6-976d-8eb715ee32ca-kube-api-access-45njq\") pod \"collect-profiles-29564685-d862w\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.436311 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45njq\" (UniqueName: \"kubernetes.io/projected/01776a81-2c53-4be6-976d-8eb715ee32ca-kube-api-access-45njq\") pod \"collect-profiles-29564685-d862w\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.436468 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01776a81-2c53-4be6-976d-8eb715ee32ca-config-volume\") pod \"collect-profiles-29564685-d862w\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.436525 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01776a81-2c53-4be6-976d-8eb715ee32ca-secret-volume\") pod \"collect-profiles-29564685-d862w\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.438133 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01776a81-2c53-4be6-976d-8eb715ee32ca-config-volume\") pod \"collect-profiles-29564685-d862w\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.448066 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01776a81-2c53-4be6-976d-8eb715ee32ca-secret-volume\") pod \"collect-profiles-29564685-d862w\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.455537 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45njq\" (UniqueName: \"kubernetes.io/projected/01776a81-2c53-4be6-976d-8eb715ee32ca-kube-api-access-45njq\") pod \"collect-profiles-29564685-d862w\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.506521 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:00 crc kubenswrapper[4745]: I0319 00:45:00.994867 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"]
Mar 19 00:45:01 crc kubenswrapper[4745]: I0319 00:45:01.816661 4745 generic.go:334] "Generic (PLEG): container finished" podID="01776a81-2c53-4be6-976d-8eb715ee32ca" containerID="27bd4b3a0e8276ec4526b43b4c69c46511a95633d7f2f98b2be66abd789ac194" exitCode=0
Mar 19 00:45:01 crc kubenswrapper[4745]: I0319 00:45:01.816722 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w" event={"ID":"01776a81-2c53-4be6-976d-8eb715ee32ca","Type":"ContainerDied","Data":"27bd4b3a0e8276ec4526b43b4c69c46511a95633d7f2f98b2be66abd789ac194"}
Mar 19 00:45:01 crc kubenswrapper[4745]: I0319 00:45:01.816760 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w" event={"ID":"01776a81-2c53-4be6-976d-8eb715ee32ca","Type":"ContainerStarted","Data":"e5b96fd814c45d8c3edaddfb9322d7b8cfcfa4b3522e5e8265909f8f4a605b8a"}
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.097917 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.291592 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01776a81-2c53-4be6-976d-8eb715ee32ca-config-volume\") pod \"01776a81-2c53-4be6-976d-8eb715ee32ca\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") "
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.292327 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45njq\" (UniqueName: \"kubernetes.io/projected/01776a81-2c53-4be6-976d-8eb715ee32ca-kube-api-access-45njq\") pod \"01776a81-2c53-4be6-976d-8eb715ee32ca\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") "
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.292442 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01776a81-2c53-4be6-976d-8eb715ee32ca-secret-volume\") pod \"01776a81-2c53-4be6-976d-8eb715ee32ca\" (UID: \"01776a81-2c53-4be6-976d-8eb715ee32ca\") "
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.294628 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01776a81-2c53-4be6-976d-8eb715ee32ca-config-volume" (OuterVolumeSpecName: "config-volume") pod "01776a81-2c53-4be6-976d-8eb715ee32ca" (UID: "01776a81-2c53-4be6-976d-8eb715ee32ca"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.306386 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01776a81-2c53-4be6-976d-8eb715ee32ca-kube-api-access-45njq" (OuterVolumeSpecName: "kube-api-access-45njq") pod "01776a81-2c53-4be6-976d-8eb715ee32ca" (UID: "01776a81-2c53-4be6-976d-8eb715ee32ca"). InnerVolumeSpecName "kube-api-access-45njq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.313552 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01776a81-2c53-4be6-976d-8eb715ee32ca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "01776a81-2c53-4be6-976d-8eb715ee32ca" (UID: "01776a81-2c53-4be6-976d-8eb715ee32ca"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.395415 4745 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/01776a81-2c53-4be6-976d-8eb715ee32ca-config-volume\") on node \"crc\" DevicePath \"\""
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.395466 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45njq\" (UniqueName: \"kubernetes.io/projected/01776a81-2c53-4be6-976d-8eb715ee32ca-kube-api-access-45njq\") on node \"crc\" DevicePath \"\""
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.395481 4745 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/01776a81-2c53-4be6-976d-8eb715ee32ca-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.835968 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w" event={"ID":"01776a81-2c53-4be6-976d-8eb715ee32ca","Type":"ContainerDied","Data":"e5b96fd814c45d8c3edaddfb9322d7b8cfcfa4b3522e5e8265909f8f4a605b8a"}
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.836032 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e5b96fd814c45d8c3edaddfb9322d7b8cfcfa4b3522e5e8265909f8f4a605b8a"
Mar 19 00:45:03 crc kubenswrapper[4745]: I0319 00:45:03.836049 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29564685-d862w"
Mar 19 00:45:04 crc kubenswrapper[4745]: I0319 00:45:04.183034 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"]
Mar 19 00:45:04 crc kubenswrapper[4745]: I0319 00:45:04.190383 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29564640-mn6rg"]
Mar 19 00:45:06 crc kubenswrapper[4745]: I0319 00:45:06.148899 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75bf4c3d-1ce3-48df-8598-7f72667807c1" path="/var/lib/kubelet/pods/75bf4c3d-1ce3-48df-8598-7f72667807c1/volumes"
Mar 19 00:45:15 crc kubenswrapper[4745]: I0319 00:45:15.607087 4745 patch_prober.go:28] interesting pod/machine-config-daemon-qt5t5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 00:45:15 crc kubenswrapper[4745]: I0319 00:45:15.607977 4745 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 00:45:15 crc kubenswrapper[4745]: I0319 00:45:15.608054 4745 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5"
Mar 19 00:45:15 crc kubenswrapper[4745]: I0319 00:45:15.609293 4745 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa"} pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 00:45:15 crc kubenswrapper[4745]: I0319 00:45:15.609407 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" containerName="machine-config-daemon" containerID="cri-o://26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa" gracePeriod=600
Mar 19 00:45:15 crc kubenswrapper[4745]: E0319 00:45:15.735437 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:45:15 crc kubenswrapper[4745]: I0319 00:45:15.954896 4745 generic.go:334] "Generic (PLEG): container finished" podID="400972f4-050f-4f26-b982-ced6f2590c8b" containerID="26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa" exitCode=0
Mar 19 00:45:15 crc kubenswrapper[4745]: I0319 00:45:15.954932 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" event={"ID":"400972f4-050f-4f26-b982-ced6f2590c8b","Type":"ContainerDied","Data":"26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa"}
Mar 19 00:45:15 crc kubenswrapper[4745]: I0319 00:45:15.955019 4745 scope.go:117] "RemoveContainer" containerID="12dd2a3323471bce1c5eddd5688e4d60bfdf81298a99430d997466c1833e7d01"
Mar 19 00:45:15 crc kubenswrapper[4745]: I0319 00:45:15.955711 4745 scope.go:117] "RemoveContainer" containerID="26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa"
Mar 19 00:45:15 crc kubenswrapper[4745]: E0319 00:45:15.956046 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:45:30 crc kubenswrapper[4745]: I0319 00:45:30.138648 4745 scope.go:117] "RemoveContainer" containerID="26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa"
Mar 19 00:45:30 crc kubenswrapper[4745]: E0319 00:45:30.139983 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b"
Mar 19 00:45:36 crc kubenswrapper[4745]: I0319 00:45:36.525041 4745 scope.go:117] "RemoveContainer" containerID="9e7db3c4b8160a045a6441db451fbc03b58d9027bbe08bfa7d59fe62a3ed7321"
Mar 19 00:45:37 crc kubenswrapper[4745]: I0319 00:45:37.488802 4745 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["service-telemetry/infrawatch-operators-dtwsv"] Mar 19 00:45:37 crc kubenswrapper[4745]: E0319 00:45:37.489177 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01776a81-2c53-4be6-976d-8eb715ee32ca" containerName="collect-profiles" Mar 19 00:45:37 crc kubenswrapper[4745]: I0319 00:45:37.489191 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="01776a81-2c53-4be6-976d-8eb715ee32ca" containerName="collect-profiles" Mar 19 00:45:37 crc kubenswrapper[4745]: I0319 00:45:37.489317 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="01776a81-2c53-4be6-976d-8eb715ee32ca" containerName="collect-profiles" Mar 19 00:45:37 crc kubenswrapper[4745]: I0319 00:45:37.489863 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:37 crc kubenswrapper[4745]: I0319 00:45:37.541022 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-dtwsv"] Mar 19 00:45:37 crc kubenswrapper[4745]: I0319 00:45:37.590147 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srxdh\" (UniqueName: \"kubernetes.io/projected/05bace8f-ac1e-4ffa-9ab5-b5746226360b-kube-api-access-srxdh\") pod \"infrawatch-operators-dtwsv\" (UID: \"05bace8f-ac1e-4ffa-9ab5-b5746226360b\") " pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:37 crc kubenswrapper[4745]: I0319 00:45:37.691342 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srxdh\" (UniqueName: \"kubernetes.io/projected/05bace8f-ac1e-4ffa-9ab5-b5746226360b-kube-api-access-srxdh\") pod \"infrawatch-operators-dtwsv\" (UID: \"05bace8f-ac1e-4ffa-9ab5-b5746226360b\") " pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:37 crc kubenswrapper[4745]: I0319 00:45:37.735168 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-srxdh\" (UniqueName: \"kubernetes.io/projected/05bace8f-ac1e-4ffa-9ab5-b5746226360b-kube-api-access-srxdh\") pod \"infrawatch-operators-dtwsv\" (UID: \"05bace8f-ac1e-4ffa-9ab5-b5746226360b\") " pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:37 crc kubenswrapper[4745]: I0319 00:45:37.813600 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:38 crc kubenswrapper[4745]: I0319 00:45:38.051248 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/infrawatch-operators-dtwsv"] Mar 19 00:45:38 crc kubenswrapper[4745]: I0319 00:45:38.067040 4745 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 00:45:38 crc kubenswrapper[4745]: I0319 00:45:38.163711 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-dtwsv" event={"ID":"05bace8f-ac1e-4ffa-9ab5-b5746226360b","Type":"ContainerStarted","Data":"d0f104d47461d5e27eab6d07de1f3892f4a281be929351000c1c470b2a8f3b00"} Mar 19 00:45:39 crc kubenswrapper[4745]: I0319 00:45:39.175217 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-dtwsv" event={"ID":"05bace8f-ac1e-4ffa-9ab5-b5746226360b","Type":"ContainerStarted","Data":"09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b"} Mar 19 00:45:39 crc kubenswrapper[4745]: I0319 00:45:39.206795 4745 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/infrawatch-operators-dtwsv" podStartSLOduration=2.090009515 podStartE2EDuration="2.206761505s" podCreationTimestamp="2026-03-19 00:45:37 +0000 UTC" firstStartedPulling="2026-03-19 00:45:38.066796127 +0000 UTC m=+2302.604991258" lastFinishedPulling="2026-03-19 00:45:38.183548117 +0000 UTC m=+2302.721743248" observedRunningTime="2026-03-19 00:45:39.192781795 +0000 UTC m=+2303.730976926" 
watchObservedRunningTime="2026-03-19 00:45:39.206761505 +0000 UTC m=+2303.744956646" Mar 19 00:45:43 crc kubenswrapper[4745]: I0319 00:45:43.138152 4745 scope.go:117] "RemoveContainer" containerID="26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa" Mar 19 00:45:43 crc kubenswrapper[4745]: E0319 00:45:43.139188 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:45:47 crc kubenswrapper[4745]: I0319 00:45:47.815136 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:47 crc kubenswrapper[4745]: I0319 00:45:47.815599 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:47 crc kubenswrapper[4745]: I0319 00:45:47.853168 4745 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:48 crc kubenswrapper[4745]: I0319 00:45:48.347200 4745 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:48 crc kubenswrapper[4745]: I0319 00:45:48.406440 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-dtwsv"] Mar 19 00:45:50 crc kubenswrapper[4745]: I0319 00:45:50.266214 4745 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/infrawatch-operators-dtwsv" podUID="05bace8f-ac1e-4ffa-9ab5-b5746226360b" containerName="registry-server" 
containerID="cri-o://09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b" gracePeriod=2 Mar 19 00:45:50 crc kubenswrapper[4745]: I0319 00:45:50.682439 4745 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:50 crc kubenswrapper[4745]: I0319 00:45:50.768491 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srxdh\" (UniqueName: \"kubernetes.io/projected/05bace8f-ac1e-4ffa-9ab5-b5746226360b-kube-api-access-srxdh\") pod \"05bace8f-ac1e-4ffa-9ab5-b5746226360b\" (UID: \"05bace8f-ac1e-4ffa-9ab5-b5746226360b\") " Mar 19 00:45:50 crc kubenswrapper[4745]: I0319 00:45:50.776417 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05bace8f-ac1e-4ffa-9ab5-b5746226360b-kube-api-access-srxdh" (OuterVolumeSpecName: "kube-api-access-srxdh") pod "05bace8f-ac1e-4ffa-9ab5-b5746226360b" (UID: "05bace8f-ac1e-4ffa-9ab5-b5746226360b"). InnerVolumeSpecName "kube-api-access-srxdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:45:50 crc kubenswrapper[4745]: I0319 00:45:50.870407 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srxdh\" (UniqueName: \"kubernetes.io/projected/05bace8f-ac1e-4ffa-9ab5-b5746226360b-kube-api-access-srxdh\") on node \"crc\" DevicePath \"\"" Mar 19 00:45:51 crc kubenswrapper[4745]: I0319 00:45:51.277975 4745 generic.go:334] "Generic (PLEG): container finished" podID="05bace8f-ac1e-4ffa-9ab5-b5746226360b" containerID="09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b" exitCode=0 Mar 19 00:45:51 crc kubenswrapper[4745]: I0319 00:45:51.278091 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/infrawatch-operators-dtwsv" Mar 19 00:45:51 crc kubenswrapper[4745]: I0319 00:45:51.278123 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-dtwsv" event={"ID":"05bace8f-ac1e-4ffa-9ab5-b5746226360b","Type":"ContainerDied","Data":"09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b"} Mar 19 00:45:51 crc kubenswrapper[4745]: I0319 00:45:51.278195 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/infrawatch-operators-dtwsv" event={"ID":"05bace8f-ac1e-4ffa-9ab5-b5746226360b","Type":"ContainerDied","Data":"d0f104d47461d5e27eab6d07de1f3892f4a281be929351000c1c470b2a8f3b00"} Mar 19 00:45:51 crc kubenswrapper[4745]: I0319 00:45:51.278221 4745 scope.go:117] "RemoveContainer" containerID="09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b" Mar 19 00:45:51 crc kubenswrapper[4745]: I0319 00:45:51.318856 4745 scope.go:117] "RemoveContainer" containerID="09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b" Mar 19 00:45:51 crc kubenswrapper[4745]: I0319 00:45:51.321517 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/infrawatch-operators-dtwsv"] Mar 19 00:45:51 crc kubenswrapper[4745]: E0319 00:45:51.321536 4745 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b\": container with ID starting with 09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b not found: ID does not exist" containerID="09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b" Mar 19 00:45:51 crc kubenswrapper[4745]: I0319 00:45:51.321676 4745 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b"} err="failed to get container status 
\"09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b\": rpc error: code = NotFound desc = could not find container \"09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b\": container with ID starting with 09bc3b119c130d7bd126071884c3be3af26201614169b5e7fafda1121d9bcf4b not found: ID does not exist" Mar 19 00:45:51 crc kubenswrapper[4745]: I0319 00:45:51.330254 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/infrawatch-operators-dtwsv"] Mar 19 00:45:52 crc kubenswrapper[4745]: I0319 00:45:52.147700 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05bace8f-ac1e-4ffa-9ab5-b5746226360b" path="/var/lib/kubelet/pods/05bace8f-ac1e-4ffa-9ab5-b5746226360b/volumes" Mar 19 00:45:54 crc kubenswrapper[4745]: I0319 00:45:54.137468 4745 scope.go:117] "RemoveContainer" containerID="26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa" Mar 19 00:45:54 crc kubenswrapper[4745]: E0319 00:45:54.137854 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.155835 4745 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29564686-sn4q2"] Mar 19 00:46:00 crc kubenswrapper[4745]: E0319 00:46:00.158759 4745 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05bace8f-ac1e-4ffa-9ab5-b5746226360b" containerName="registry-server" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.158906 4745 state_mem.go:107] "Deleted CPUSet assignment" podUID="05bace8f-ac1e-4ffa-9ab5-b5746226360b" containerName="registry-server" Mar 19 
00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.159163 4745 memory_manager.go:354] "RemoveStaleState removing state" podUID="05bace8f-ac1e-4ffa-9ab5-b5746226360b" containerName="registry-server" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.159918 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564686-sn4q2" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.163417 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.163491 4745 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.163786 4745 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-g4ks9" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.168137 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564686-sn4q2"] Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.256281 4745 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrsgp\" (UniqueName: \"kubernetes.io/projected/69beb762-e553-4e8c-882e-6b3d18fc370d-kube-api-access-xrsgp\") pod \"auto-csr-approver-29564686-sn4q2\" (UID: \"69beb762-e553-4e8c-882e-6b3d18fc370d\") " pod="openshift-infra/auto-csr-approver-29564686-sn4q2" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.360274 4745 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrsgp\" (UniqueName: \"kubernetes.io/projected/69beb762-e553-4e8c-882e-6b3d18fc370d-kube-api-access-xrsgp\") pod \"auto-csr-approver-29564686-sn4q2\" (UID: \"69beb762-e553-4e8c-882e-6b3d18fc370d\") " pod="openshift-infra/auto-csr-approver-29564686-sn4q2" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 
00:46:00.381535 4745 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrsgp\" (UniqueName: \"kubernetes.io/projected/69beb762-e553-4e8c-882e-6b3d18fc370d-kube-api-access-xrsgp\") pod \"auto-csr-approver-29564686-sn4q2\" (UID: \"69beb762-e553-4e8c-882e-6b3d18fc370d\") " pod="openshift-infra/auto-csr-approver-29564686-sn4q2" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.493272 4745 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29564686-sn4q2" Mar 19 00:46:00 crc kubenswrapper[4745]: I0319 00:46:00.926978 4745 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29564686-sn4q2"] Mar 19 00:46:01 crc kubenswrapper[4745]: I0319 00:46:01.352107 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564686-sn4q2" event={"ID":"69beb762-e553-4e8c-882e-6b3d18fc370d","Type":"ContainerStarted","Data":"79d7b7915d396f6b39b4ced4a353a5161605187a9e1ff73bfc0b8de04c608ff7"} Mar 19 00:46:03 crc kubenswrapper[4745]: I0319 00:46:03.382736 4745 generic.go:334] "Generic (PLEG): container finished" podID="69beb762-e553-4e8c-882e-6b3d18fc370d" containerID="98e61b0a7067b97891990f60a8405193d8eeb8ab49971c766e8af3c11b3b1e1d" exitCode=0 Mar 19 00:46:03 crc kubenswrapper[4745]: I0319 00:46:03.383148 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564686-sn4q2" event={"ID":"69beb762-e553-4e8c-882e-6b3d18fc370d","Type":"ContainerDied","Data":"98e61b0a7067b97891990f60a8405193d8eeb8ab49971c766e8af3c11b3b1e1d"} Mar 19 00:46:04 crc kubenswrapper[4745]: I0319 00:46:04.628930 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564686-sn4q2" Mar 19 00:46:04 crc kubenswrapper[4745]: I0319 00:46:04.736618 4745 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrsgp\" (UniqueName: \"kubernetes.io/projected/69beb762-e553-4e8c-882e-6b3d18fc370d-kube-api-access-xrsgp\") pod \"69beb762-e553-4e8c-882e-6b3d18fc370d\" (UID: \"69beb762-e553-4e8c-882e-6b3d18fc370d\") " Mar 19 00:46:04 crc kubenswrapper[4745]: I0319 00:46:04.753171 4745 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69beb762-e553-4e8c-882e-6b3d18fc370d-kube-api-access-xrsgp" (OuterVolumeSpecName: "kube-api-access-xrsgp") pod "69beb762-e553-4e8c-882e-6b3d18fc370d" (UID: "69beb762-e553-4e8c-882e-6b3d18fc370d"). InnerVolumeSpecName "kube-api-access-xrsgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 00:46:04 crc kubenswrapper[4745]: I0319 00:46:04.839191 4745 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrsgp\" (UniqueName: \"kubernetes.io/projected/69beb762-e553-4e8c-882e-6b3d18fc370d-kube-api-access-xrsgp\") on node \"crc\" DevicePath \"\"" Mar 19 00:46:05 crc kubenswrapper[4745]: I0319 00:46:05.404909 4745 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29564686-sn4q2" event={"ID":"69beb762-e553-4e8c-882e-6b3d18fc370d","Type":"ContainerDied","Data":"79d7b7915d396f6b39b4ced4a353a5161605187a9e1ff73bfc0b8de04c608ff7"} Mar 19 00:46:05 crc kubenswrapper[4745]: I0319 00:46:05.404955 4745 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29564686-sn4q2" Mar 19 00:46:05 crc kubenswrapper[4745]: I0319 00:46:05.404963 4745 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79d7b7915d396f6b39b4ced4a353a5161605187a9e1ff73bfc0b8de04c608ff7" Mar 19 00:46:05 crc kubenswrapper[4745]: I0319 00:46:05.701845 4745 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29564680-w752m"] Mar 19 00:46:05 crc kubenswrapper[4745]: I0319 00:46:05.709328 4745 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29564680-w752m"] Mar 19 00:46:06 crc kubenswrapper[4745]: I0319 00:46:06.148105 4745 scope.go:117] "RemoveContainer" containerID="26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa" Mar 19 00:46:06 crc kubenswrapper[4745]: E0319 00:46:06.148350 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" Mar 19 00:46:06 crc kubenswrapper[4745]: I0319 00:46:06.151677 4745 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2afbbf4f-151b-4d25-9658-58353102abde" path="/var/lib/kubelet/pods/2afbbf4f-151b-4d25-9658-58353102abde/volumes" Mar 19 00:46:17 crc kubenswrapper[4745]: I0319 00:46:17.138727 4745 scope.go:117] "RemoveContainer" containerID="26cf031263ba2ce39a17b9bc1f91a79f4c9bf57080b9988d3962cef68a16c6fa" Mar 19 00:46:17 crc kubenswrapper[4745]: E0319 00:46:17.139742 4745 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qt5t5_openshift-machine-config-operator(400972f4-050f-4f26-b982-ced6f2590c8b)\"" pod="openshift-machine-config-operator/machine-config-daemon-qt5t5" podUID="400972f4-050f-4f26-b982-ced6f2590c8b" var/home/core/zuul-output/logs/crc-cloud-workdir-crc-all-logs.tar.gz0000644000175000000000000000005515156643556024465 0ustar coreroot  Om77'(var/home/core/zuul-output/logs/crc-cloud/0000755000175000000000000000000015156643556017402 5ustar corerootvar/home/core/zuul-output/artifacts/0000755000175000017500000000000015156636500016514 5ustar corecorevar/home/core/zuul-output/docs/0000755000175000017500000000000015156636500015464 5ustar corecore